Quality Management Recommendation for Setting Up a QA Org
Author: Pradeep Suresh
05-Jan-2016
Where we are & where we want to get to
Challenges in the Current Model
▪ No standard process for quality
▪ No defined metrics for quality health
▪ No standard automation framework and best practices
▪ No governance model for quality
Advantages of the Proposed Model
➢ Process improvement
» Streamline processes and frameworks
» Set up a governance model
» Build a test data and knowledge base
» Reduce defects in production
➢ Quality improvement
» Define metrics to measure quality
» Increase test coverage & test effectiveness
Recommend systems and tools based on
quality and performance requirements
Discover, evaluate, and analyse tools
and processes
Faster go-to-market: retain the
knowledge base
Pilot release-based
execution, capture metrics,
and strive for improvement
Set Up a Governance Model
Continuous Process Improvement
Discovery Phase
• Discover current
drawbacks and
capabilities
• Evaluate existing
quality systems and
processes
• Measure current skillset
and knowledge base
• Gather current metrics
as a baseline
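The baseline metrics gathered in the Discovery Phase can be derived from a few raw release counts. A minimal sketch of what that derivation might look like (field names, metric choices, and sample numbers are all illustrative, not prescribed by this deck):

```python
from dataclasses import dataclass

@dataclass
class ReleaseStats:
    """Raw counts gathered for one release (illustrative fields)."""
    total_defects: int        # defects found across all phases
    production_defects: int   # defects that leaked to production
    tests_executed: int
    tests_passed: int
    requirements: int
    requirements_covered: int  # requirements with at least one mapped test

def baseline_metrics(stats: ReleaseStats) -> dict:
    """Derive baseline quality metrics from raw release counts."""
    return {
        # share of defects that escaped to production (lower is better)
        "defect_leakage_pct": 100 * stats.production_defects / stats.total_defects,
        # share of executed tests that passed
        "pass_rate_pct": 100 * stats.tests_passed / stats.tests_executed,
        # share of requirements covered by tests
        "requirement_coverage_pct": 100 * stats.requirements_covered / stats.requirements,
    }

m = baseline_metrics(ReleaseStats(50, 5, 400, 380, 120, 96))
```

Captured once at the start, these numbers give the "set base" against which later pilot releases are compared.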
Pilot Phase
Recommend systems and
processes based on a
holistic approach
Recommend an automation
tool
Set up the data management
process
Pilot the recommended
approach with a minimum-
risk release
Measure metric trends
Institutionalizing
Governing Phase
Reduced test execution
effort for the regression suite,
achieved through a defined
automation framework
Establish a test data Center
of Excellence (CoE) with clearly
defined roles & responsibilities
Define a Non-Functional
Requirements (NFR) strategy
and roadmap
Attain test case rationalization,
test automation, risk-based
testing, and test data
management
0 to 3 months 3 to 6 months 6 to 9 months 9 to 12 months
The Journey
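The Pilot Phase calls for measuring metric trends across releases. A small sketch of how a per-release metric series might be classified (the function name and thresholds are illustrative; "lower is better" applies to metrics such as defect leakage):

```python
def metric_trend(history: list[float], lower_is_better: bool = True) -> str:
    """Classify a per-release metric series, ordered oldest to newest.

    For a lower-is-better metric (e.g. defect leakage %), a falling
    series counts as improving; pass lower_is_better=False for
    metrics such as pass rate or coverage.
    """
    if len(history) < 2:
        return "insufficient data"
    delta = history[-1] - history[0]
    if delta == 0:
        return "flat"
    improving = delta < 0 if lower_is_better else delta > 0
    return "improving" if improving else "degrading"

# Defect-leakage % captured at the end of each pilot release:
trend = metric_trend([12.0, 9.5, 7.0])
```

Feeding each phase's captured metrics through a check like this is one simple way to make "measure metric trends" concrete on a dashboard.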
Drivers
● Test management
tools
● Test data
management tools
● Automation tools
● Performance tools
People
▪ Test Architects
▪ Test Engineers
▪ SMEs
Executive commitments
Governance and communication framework
Programs
New features
Migration
/upgrade
Implementation
Maintenance
Quality
Management
Reporting
Dashboard
Process &
Quality
Assurance
Metrics & SLA
Benchmarking
Centralized knowledge management
Infrastructure and Support Team
The Framework
Proposed Iterative Testing Methodology
Risk-Based Testing | Test Automation | Test Data Mgmt
▪ Requirement Analysis
▪ Test Requirement
Identification
▪ Release Test scope
▪ Release Test Plan
▪ System & Integration Test Case
/ Scenario Design
▪ Test Data set up
▪ Regression Test Identification
Central Knowledge Repository
Master Test Strategy/ Test Plan
Reusable Test Scenarios / Cases (Test case Repository)
Metrics Dashboard
Release specific Testing
▪ System Testing
▪ Integration Testing
▪ Regression Testing
Initiate KT for Release 2
▪ Requirement
Analysis
▪ Test Requirement
Identification
▪ Release Test
scope
▪ Release Test Plan
▪ System Testing
▪ Integration Testing
▪ Regression Testing
KT for Release N
▪ System & Integration Test Case
/ Scenario Design
▪ Test Data set up
▪ Regression Test Identification
▪ Requirement Analysis
▪ Test Requirement
Identification
▪ Release Test
scope
▪ Release Test Plan
▪ System & Integration Test
Case / Scenario Design
▪ Test Data set up
▪ Regression Test Identification
▪ System Testing
▪ Integration Testing
▪ Regression Testing
Release Plan | Status Report | Risk/Issue Management
Test Requirements Analysis | Test Planning | Test Design | Test Execution
The Proposed Iterative Methodology
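The methodology above leans on risk-based testing to decide which regression cases to run each release. A minimal sketch of the standard likelihood × impact scoring that risk-based testing uses (class and field names, scales, and sample cases are all illustrative):

```python
from dataclasses import dataclass

@dataclass
class RegressionCase:
    """A regression candidate scored for risk-based selection (illustrative)."""
    name: str
    failure_probability: int  # 1 (unlikely to fail) .. 5 (very likely)
    business_impact: int      # 1 (cosmetic) .. 5 (revenue/compliance critical)

    @property
    def risk(self) -> int:
        # Classic risk-based-testing score: likelihood x impact.
        return self.failure_probability * self.business_impact

def select_regression_suite(cases: list[RegressionCase], budget: int) -> list[str]:
    """Pick the highest-risk cases that fit the execution budget."""
    ranked = sorted(cases, key=lambda c: c.risk, reverse=True)
    return [c.name for c in ranked[:budget]]

suite = select_regression_suite(
    [RegressionCase("checkout", 4, 5), RegressionCase("profile_edit", 2, 2),
     RegressionCase("login", 3, 5), RegressionCase("help_page", 1, 1)],
    budget=2,
)
```

Ranking this way keeps the regression suite small for each release while concentrating effort on the riskiest areas, which is what test case rationalization in the Governing Phase aims at.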
Proposed Team Structure
Core QA team structure
Automation Flex Team on-boarding
Core Team during Pilot | Test Data Mgmt
Core Team of the Project
Data Management Flex Team
Automation Flex Team
Execution Team during Pilot Phase
QA Governance Model
Project Manager (1)
Vertical head
CEO
QA Director
ORG PROJECT #1 PROJECT #2
Project Manager (2)
Executive Management | PMO | Project Office
Middle Team
Dev Team
SQA Team
SQA Lead
Middle Tier
Developer
SQA Lead
Web
Developer
Business
Analyst
Web Developer
Team
Business
Analyst
Dev Lead
Automation
Team
SQA Team
Automation
Team
Note: This model shows only the QA hierarchy. The PM here represents the QA Manager, and the project may also have a Dev Manager, Program Manager, etc.
©2016 PayPal Inc. Confidential and proprietary.
Budgeting for any operational, license
procurement, and miscellaneous expenses
Staff provisioning
Staff motivation and engagement
Commitment and support
Action Plan
Executive Management Commitments
Monitoring progress
Roles and Responsibilities
Roles | Responsibilities
QA Director
• Define QA strategy, approach, and execution in development projects.
• Lead and direct the QA leadership team.
• Provide technical leadership and expertise in test automation, quality assurance, and testing.
• Be accountable for test automation projects; mentor and provide leadership to the QA automation developers and managers.
• Participate in interviews, induction, training, and performance evaluation of all QA leads.
• Ensure that development teams adhere to the principles, guidelines, and best practices of the defined QA strategy.
• Focus on continuous QA improvement, including the use of appropriate testing tools, test techniques, and test automation.
• Build and maintain quality standards, and enforce technical and testing standards.
• Monitor all QA activities, test results, leaked defects, and root cause analysis; identify areas of improvement and implement the steps required to improve processes.
• Gather and present testing metrics and testing activities for the projects to key stakeholders.
• Ensure proper usage of available tools to gain the maximum benefit from the QA effort, including testing tools for functional, performance, and automation testing.
• Manage training and continuous learning of QA staff through short courses, conferences, meetups, certifications, etc.
• Serve as the escalation point for all matters related to testing and quality assurance, and as the primary point of contact for the QA teams.
• Direct the development of the QA strategy, methodology, discipline, and framework; drive and improve the QA team in automated and agile testing.
• Provide technical expertise in test automation, testing methodologies, testing processes, tools, and techniques across the teams.
• Work with QA managers, development managers, and the Software Development Director to develop and execute QA strategies that meet and exceed department and corporate quality goals.
QA Manager
∙ Interface with onsite development leads
∙ Interface with the offshore test lead
∙ Gather application knowledge from Avon business users
∙ Ensure effective knowledge transition to the offshore test team
∙ Get clarifications from Avon business users
∙ Facilitate sign-off on all deliverables
∙ Provide status and escalate issues
∙ Review deliverables
∙ Set up and drive defect triage meetings
∙ Identify, collect, and share all relevant metrics as agreed with Avon stakeholders
QA Engineer
∙ Work with business users to identify testing requirements
∙ Prepare test cases for testing application functionality
∙ Design test cases
∙ Execute manual test cases
∙ Verify and report test results
∙ Provide inputs to the automation team
∙ Provide regression test cases and test data