1. HEALTHCARE QA CENTER OF EXCELLENCE (CoE) FRAMEWORK
A QA Center of Excellence (QA CoE, also called a Testing Center of Excellence or TCoE) is a centralized testing model that provides a platform for standardizing test processes and for optimal utilization of testing resources. It brings infrastructure and resources together to achieve excellence in testing functions.
The diagram below presents the core building blocks of a QA Center of Excellence framework.
1.1 Parts of the Framework
1.1.1 Executive Commitments
The obligations at the executive level for the success of the QA CoE are:
▪ Budget – The overall investment required to set up and sustain the QA CoE (cost of resources, training, infrastructure, hardware / software licenses, audits) has to be approved
▪ Business Drivers – Clear and concise objectives should be set for the QA CoE so that success can be measured
▪ Transparency – The QA CoE will be successful only when standardized and streamlined processes are followed across projects. There should also be visibility and transparency to eliminate any latency that may delay testing activities
▪ Leadership Review – Continuous feedback and suggestions from stakeholders and leaders are essential for a CoE to be successful
1.1.2 Key Drivers
The following factors should be in place for the success of the QA CoE:
▪ Empowered & Skilled Personnel
• Shared / flexible resource pooling across different projects
• Continuous improvement and innovation
• Build a knowledge repository for cross-skilling and training
• Create a defined and structured core team with expertise in various applications
▪ Standardized Approach
• Structured, business-driven testing approach (TDD / TLD & BDD)
• Training and audits to ensure adherence to standards and guidelines
• Tailor-made best practices and processes to achieve greater ROI
• Metrics management / dashboard reporting mechanism
▪ High-end IT Expertise
• Create automation frameworks that ensure reusability
• Leverage top industry tools such as TFS, HPQC, QTP, Selenium, TestComplete, Ranorex, etc.
• Keep abreast of the latest trends in testing through continuous training and R&D
▪ Streamlined Project Management
• Define KPIs / SLAs to meet organization objectives
• QA resource management and planning
• Apply quality management principles across all projects
• Focus on PHI compliance to ensure that healthcare security is not compromised
1.1.3 Key Performance Indicators
The following KPIs should be measured to track the success of the QA CoE against the set business drivers and, based on the results, improvements should be brought about to achieve the goals:
▪ Improve Test Effectiveness
• Define metrics to gauge the quality of the QA deliverables
• KPIs should be defined at each stage of the STLC
• Test plan coverage, test creation and execution productivity, defect escapes in production, and defect density should be measured in every release
▪ Reduce Time to Market
• Adopt a shift-left approach
• Integrate testing strategies into the CI / CD pipeline
• Introduce parallel batch executions
• Improve project management
▪ Performance Benchmark and Compliance Review
• Set up the processes for performance benchmarking
• Identify KPIs that can be measured and compared against the benchmark
• Set up standards for the compliance review process
▪ Conformance to Standards and Quality
• Define and set the guidelines for standards to be followed
• Identify areas to be measured for conformance to quality and standards
• Define ways to measure conformance to standards
1.2 Details of Core Services
1.2.1 List of Core Testing Services
▪ Manual & Automation
• Smoke / BVT Testing
• Sanity Testing
• Functional Testing
• Integration Testing
• Database Testing
• Regression Testing
• User Interface Testing
• Patient Safety Testing
• Exploratory Testing
▪ Security
▪ Performance
1.2.2 Core Testing Strategies
The following testing strategies (manual / automated) should be implemented as part of the testing process:
▪ Smoke / BVT Testing
Test Objective
▪ To validate that the critical functionalities of the application are working properly
Technique
▪ Create test cases to verify that the basic functionality is working as expected
▪ Execute the test cases and ensure that the objective is satisfied
Test Entry Criteria
▪ Build is deployed successfully on the test environment
Test Completion Criteria
▪ All planned smoke tests have been executed
▪ No tests have failed in the smoke test execution
Special Considerations
▪ Should be performed every time a build is deployed
▪ Required test data should be present to test the end-to-end functionality
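Where the smoke suite is automated (the Suggested Flow section later recommends an automated BVT after each build), it can run against every deployed build. Below is a minimal sketch, assuming pytest with Selenium's Python bindings; the URL, element locators, and credentials are hypothetical placeholders to be adapted to the application under test:

import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://ehr.example.com"  # hypothetical test-environment URL

@pytest.fixture
def driver():
    drv = webdriver.Chrome()
    yield drv
    drv.quit()

def test_login_page_loads(driver):
    # Critical functionality 1: the application is reachable after deployment.
    driver.get(BASE_URL)
    assert "Login" in driver.title

def test_login_with_valid_credentials(driver):
    # Critical functionality 2: a valid user can authenticate.
    driver.get(BASE_URL)
    driver.find_element(By.ID, "username").send_keys("smoke_user")
    driver.find_element(By.ID, "password").send_keys("smoke_pass")
    driver.find_element(By.ID, "login-button").click()
    assert driver.current_url.endswith("/dashboard")

Because the entry criterion is simply a successful deployment, a suite like this can be triggered from the CI / CD pipeline on every build.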
▪ Sanity Testing
Test Objective
▪ To validate that the functionalities of the specific features being tested are working as expected
▪ It is specific to the new feature being developed and tested
Technique
▪ Create test cases to verify that the basic functionality of the feature is working as expected
▪ Execute the test cases and ensure that the objective is satisfied
Test Entry Criteria
▪ Build is deployed successfully on the test environment
Test Completion Criteria
▪ All planned tests have been executed
▪ No tests have failed in the test execution
Special Considerations
▪ Should be performed every time a build is deployed
▪ Required test data should be present to test the end-to-end functionality
▪ Functional Testing
Test Objective
▪ To validate that the application's functions work correctly for positive, negative, and boundary conditions
Technique
▪ Create test cases to verify the functions as specified in the user stories
▪ Execute the test cases and ensure that the objective is satisfied
Test Entry Criteria
▪ User story is completed by the developer and code is pushed to the QA environment
Test Completion Criteria
▪ All planned tests have been executed
▪ All identified defects have been addressed
Special Considerations
▪ Should be performed in the QA environment before signing off the user story
▪ Required test data should be present to test the end-to-end functionality
▪ Integration Testing
Test Objective
▪ To validate that the application's functions work correctly for positive, negative, and boundary conditions
▪ To validate that data inserted from one application is displayed in the other
Technique
▪ Create test cases to verify the functions and integrations as specified in the user stories
▪ Execute the test cases and ensure that the objective is satisfied
Test Entry Criteria
▪ User story is completed by the developer and code is pushed to the Integration environment
▪ The 3rd party systems are up and running, and the connections between the systems are enabled
Test Completion Criteria
▪ All planned tests have been executed
▪ All identified defects have been addressed
Special Considerations
▪ The testing should be performed in the Integration environment, where the environment is connected to the other 3rd party systems and data flow between the applications is set up
▪ Integration testing takes place before the release to ensure all systems are functioning well. It is also executed when the user stories / functionality depend on 3rd party systems
▪ Database Testing
Test Objective
To validate that the following criteria are satisfied:
▪ The database access methods function as required
▪ The integrity of the data stored in the database is maintained
Technique
Create scenarios to:
▪ Inspect the database to ensure that data has been populated as intended and all database events have triggered appropriately
▪ Execute the scenarios to ensure that the objective is satisfied
Test Entry Criteria
▪ User story is completed by the developer, with code and data pushed to the test environments and database
Test Completion Criteria
▪ All planned tests have been executed
▪ All identified defects have been addressed
Special Considerations
▪ NA
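These database inspections can themselves be scripted. Below is a minimal sketch, assuming a SQLite test database with hypothetical patient and audit_log tables; a real project would substitute its own database driver and schema:

import sqlite3

def test_patient_insert_preserves_integrity():
    conn = sqlite3.connect("test_ehr.db")  # hypothetical test database
    try:
        cur = conn.cursor()
        # Verify the data entered through the UI workflow was persisted as intended.
        cur.execute("SELECT first_name, dob FROM patient WHERE mrn = ?", ("MRN-1001",))
        row = cur.fetchone()
        assert row is not None, "patient record was not populated"
        assert row[0] == "Jane", "first name not stored correctly"
        # Verify a database event (e.g., an audit trigger) fired appropriately.
        cur.execute("SELECT COUNT(*) FROM audit_log WHERE mrn = ?", ("MRN-1001",))
        assert cur.fetchone()[0] >= 1, "audit trigger did not fire"
    finally:
        conn.close()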
▪ User Interface Testing
Test Objective
To validate that the following criteria are satisfied:
▪ Navigation through the application properly reflects business functions and requirements
▪ Report objects and characteristics, such as menus, modules, filters, sizes, positions, states, and focus, conform to the client specifications and the user story
Technique
▪ Create test cases for each report to verify proper navigation and object states for each application window and object
▪ Execute the set of test cases to ensure that the objective is satisfied
Test Entry Criteria
▪ User story is completed by the developer and code is pushed to the test environment
Test Completion Criteria
▪ All planned tests have been executed
▪ All identified defects have been addressed
Special Considerations
▪ NA
▪ Regression Testing
Test Objective
▪ To ensure that functionality that worked before a new release does not break after that release
Technique
▪ Identify a set of test cases to be executed as the first test suite (the regression test suite) on any new release
▪ When a new release is received by the testing team, all test cases in the regression test suite should be executed to ensure that the basic functionality of the previous release is not broken in the new release
Test Entry Criteria
▪ A complete integrated build, which would be released to production, is deployed on the test environment
Test Completion Criteria
▪ All planned tests have been executed
▪ All identified defects have been addressed
Special Considerations
▪ Regression testing is done before every release
▪ Only one regression cycle is needed if no critical / high-severity defect is found
▪ If a critical / high-severity defect is raised, a second regression cycle is executed after the developer fixes it
▪ Security Testing
Test Objective
▪ To identify security vulnerabilities in the application
▪ To validate that the application is reasonably secure from known vulnerabilities
Technique
▪ Create a test plan with security test scenarios
▪ Perform automated security testing using tools
▪ Perform manual security testing to confirm vulnerabilities reported by the tools and to cover areas the tools do not, e.g., business logic bypass
Test Entry Criteria
▪ User story is completed by the developer and code is pushed to the test environment
▪ Required test data should be present to test the end-to-end functionality
Test Completion Criteria
▪ All planned tests have been executed
▪ All identified security vulnerabilities have been addressed
Special Considerations
▪ Should be performed every time there is a new change
▪ There should be a complete code freeze in the test environment
▪ To ensure the integrity of the test results, no other form of testing should take place in the environment while security testing is conducted
▪ Performance Testing
Test Objective
▪ To validate that the application meets the defined performance SLAs
▪ To benchmark the performance of the application under expected user load
▪ To assess the impact of architectural changes (new components added, configuration changes) on performance
▪ To evaluate the scalability of the application in terms of the number of users and transactions supported
Technique
▪ Understand the production / expected usage to design the workload and identify workflows and scenarios for performance testing
▪ Create scripts for end-to-end user workflows
▪ Execute planned tests and capture performance metrics (a minimal load-generation sketch follows this table)
▪ Highlight any performance issues in terms of response time, application / system errors, and server resource usage
Test Entry Criteria
▪ The workflows in scope for performance testing are functionally stable
▪ All information required to plan and design the test is provided to the performance test engineer
Test Completion Criteria
▪ All planned tests have been executed
▪ All issues identified have been resolved / deferred
Special Considerations
Performance testing should be executed:
▪ If the expected load on the application will increase
▪ If the product / application is rolled out in a new region / to new customers
▪ If architectural changes have been made to the application
▪ If new functionalities / workflows have been added or modified
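Dedicated tools (e.g., JMeter or LoadRunner) are the usual choice for workload execution. Purely to illustrate capturing basic response-time metrics, here is a minimal standard-library Python sketch against a hypothetical workflow endpoint:

import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://ehr.example.com/api/worklist"  # hypothetical endpoint under test

def timed_request(_):
    # Time a single end-to-end request, including reading the full response.
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=30) as resp:
        resp.read()
    return time.perf_counter() - start

def run_load(users=25, requests_per_user=4):
    # Simulate concurrent users with a thread pool and collect timings.
    with ThreadPoolExecutor(max_workers=users) as pool:
        timings = list(pool.map(timed_request, range(users * requests_per_user)))
    timings.sort()
    print(f"avg: {statistics.mean(timings):.3f}s")
    print(f"p90: {timings[int(len(timings) * 0.9)]:.3f}s")
    print(f"max: {timings[-1]:.3f}s")

if __name__ == "__main__":
    run_load()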
▪ Patient Safety Testing
Patient safety is emphasized in healthcare through the prevention, reduction, reporting, and analysis of medical errors that may lead to adverse effects. A patient safety test suite is created to ensure that:
▪ Data of one user doesn't get displayed to another user on any system
▪ Data entered on any platform (web application, iOS, or Android) is stored and displayed correctly
▪ Patient-related information is neither incomplete nor displayed inaccurately in the EHR, so that correct and relevant information is available for making clinical decisions
Test Objective
▪ To test workflows covering the usability and patient safety aspects of the application that can impact / risk the life of a patient
Technique
▪ Identify critical workflows that may risk a patient's life
▪ Identify usability test cases to be performed for certification that focus on EHR functions required by the ONC
▪ Identify real-life scenarios pertaining to:
▪ Data Entry – Ensure that the right fields have the correct values to select, units of measurement are appropriate, and labels are marked properly, are easily interpretable, and conform to standards
▪ Authorization – The correct roles and access rights are set to ensure wrong access to information is not provided
▪ Alerting – Proper alerts are generated whenever an important notification needs to be passed to the clinician; pop-ups aren't blocked; alerts are self-explanatory and simple
▪ Interoperability – Data should flow correctly between different systems, or between modules within the same system, and the related standards are maintained
▪ Visual Display – The labels and fonts of the fields should be clearly visible for clinicians to select appropriate values
▪ Automated Settings – Automated settings should be properly validated to ensure that dates, doses, and other values default to the right values
Test Entry Criteria
▪ A complete integrated build, which would be released to production, is deployed on the test environment
Test Completion Criteria
▪ All planned tests have been executed
▪ All identified defects have been addressed
Special Considerations
▪ Patient safety testing is done on priority before every release
▪ The release will be stopped if any of the patient safety test cases fail
▪ The member who creates the patient safety test suite should have proper domain and product knowledge
▪ Exploratory Testing
Test Objective
▪ The main purpose is to identify bugs (as opposed to scripted testing, which validates the application against requirements)
▪ To test the application without specific test cases; test design and execution happen simultaneously
Technique
▪ Define the scope of exploratory testing and the time limit for the same (90 minutes)
▪ Note defects from previous releases and categorize them to identify areas for exploratory testing
▪ Identify use cases that imitate real-life users
▪ Test the application using any of the following techniques:
▪ Freestyle exploratory testing – Test the application in an ad hoc manner
▪ Scenario-based testing – Testing is based on scenarios provided by customers or prepared by the test team
▪ Strategy-based testing – Common testing techniques like decision table based testing, cause-effect graphing, and error guessing are combined with exploratory testing
▪ Document areas covered to understand test coverage
▪ Document defects found to enhance the exploratory test plan
Test Entry Criteria
▪ A complete integrated build, which would be released to production, is deployed on the test environment
▪ A list of defects from previous releases is provided by the production support team
Test Completion Criteria
▪ The time slotted for exploratory testing is over
▪ Major areas of the application are covered as part of exploratory testing
Special Considerations
▪ The success of exploratory testing depends on the tester's knowledge of the application and the domain
▪ Documentation should be done along with the testing
▪ Exploratory testing cannot be automated
1.2.8 Test Data Management
Test data management is the creation of non-production datasets that reliably mimic an organization's actual data so that systems and applications developers can perform rigorous and valid system tests. It helps organizations create better quality software that will perform reliably on deployment. It reduces bug fixes and rollbacks and, overall, creates a more cost-efficient software deployment process. It also lowers the organization's compliance and security risks.
Factors to consider for Test Data Management
1. Test Data Sources
▪ Manually create test datasets using the application workflows, e.g., patient creation, adding allergies and other details for the patient, claims creation and other workflows for these claims, etc.
▪ Automate test data creation by automating the workflows mentioned above, which can be used repeatedly to create numerous records
▪ Back-end insertion of the test data directly into the DB by running SQL scripts or generating bulk HL7 messages / EDI files, etc. (a minimal sketch follows this list)
▪ Running batch files manually to copy data from production, de-identify it, and then load it into the test environment
▪ Use of tools to sync de-identified production data to the test environments
▪ Mock test data by creating stubs or using automated tools, e.g., Mountebank
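As an illustration of back-end insertion, below is a minimal Python sketch that seeds bulk patient records into a hypothetical SQLite test database; a real project would target the application's own schema, or generate HL7 / EDI feeds instead of SQL:

import sqlite3

def seed_patients(db_path="test_ehr.db", count=1000):
    # Bulk-insert synthetic patient rows so testers have repeatable data.
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS patient "
            "(mrn TEXT PRIMARY KEY, first_name TEXT, last_name TEXT, dob TEXT)"
        )
        rows = [
            (f"MRN-{i:06d}", f"Test{i}", "Patient", "1980-01-01")
            for i in range(count)
        ]
        conn.executemany(
            "INSERT OR REPLACE INTO patient VALUES (?, ?, ?, ?)", rows
        )
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    seed_patients()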
2. PHI Data Masking
▪ There are various ways in which test data can be masked. Especially when dealing with PHI, it is important to mask the PII and PHI fields. The following information needs to be masked:
▪ The individual's past, present, or future physical or mental health or condition
▪ Provision of health care to the individual
▪ Past, present, or future payment for the provision of health care to the individual, where the information identifies the individual or there is a reasonable basis to believe it can be used to identify the individual
▪ Protected health information includes many common identifiers (e.g., name, address, birth date, Social Security Number) when they can be associated with the health information listed above
▪ A few of the ways in which data can be masked are (a minimal sketch follows this list):
▪ Encryption – Data is encrypted using an encryption key, and only someone who has the decryption key can decode the data
▪ Substitution – Replace the value with other meaningful data that acts as an alias for the value being masked, e.g., replace the SSN with a value produced by a random number generator
▪ Redaction – Replace the data with a generic value, e.g., all credit card numbers can be replaced with 1111-2222-3333-4444
▪ Date Aging – All dates can be set back per a policy of, say, 6 months or 2 years as applicable, e.g., the DOB of males aged more than 50 could be reduced by a year
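A minimal Python sketch of the substitution, redaction, and date-aging methods described above, using hypothetical field names; production masking should rely on a vetted de-identification tool and a documented policy:

import random
from datetime import date, timedelta

def substitute_ssn(_ssn: str) -> str:
    # Substitution: replace the SSN with a random but well-formed value.
    return (f"{random.randint(100, 899):03d}-"
            f"{random.randint(10, 99):02d}-"
            f"{random.randint(1000, 9999):04d}")

def redact_credit_card(_number: str) -> str:
    # Redaction: replace every card number with one generic value.
    return "1111-2222-3333-4444"

def age_date(dob: date, days_back: int = 180) -> date:
    # Date aging: shift dates back by a fixed policy period (about 6 months here).
    return dob - timedelta(days=days_back)

record = {"ssn": "123-45-6789", "card": "4111-1111-1111-1111", "dob": date(1970, 5, 1)}
masked = {
    "ssn": substitute_ssn(record["ssn"]),
    "card": redact_credit_card(record["card"]),
    "dob": age_date(record["dob"]).isoformat(),
}
print(masked)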
3. Test Data Cleanup
▪ Manually delete the test data entries created, using the appropriate workflows, e.g., a registered patient should be discharged, entered claims should be denied or approved, etc.
▪ Run automated scripts to change the status of the test data to the required end state so that it is not visible as active
▪ Recreate the DB schema completely (delete and create a new DB schema)
▪ Run SQL scripts or trigger HL7 / EDI files to change the status of the test data entries
4. Test Data Backup
Once the test data is created and the application is ready to be used, create a dump (backup) of the DB. The next time the build is deployed, run the restore scripts to load the test data into the application.
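As a sketch of this dump-and-restore cycle, assuming a PostgreSQL test database and the standard pg_dump / pg_restore client tools on the PATH; adapt for the actual database platform:

import subprocess

DB = "test_ehr"          # hypothetical test database name
DUMP = "test_data.dump"  # backup file holding the prepared test data

def backup_test_data():
    # Taken once the test data is created and the application is usable.
    subprocess.run(["pg_dump", "--format=custom", f"--file={DUMP}", DB], check=True)

def restore_test_data():
    # Run after each build deployment to reload the known-good test data.
    subprocess.run(["pg_restore", "--clean", f"--dbname={DB}", DUMP], check=True)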
5. Test Data Maintenance
Test data should be refreshed periodically. The refresh period should be defined keeping in mind:
▪ If the period is too short, testers will find it difficult to recreate their specific test data that is not created through automated scripts or bulk file uploads
▪ If the period is too long, the test data might not be accurate as per the latest functionality, and testers might miss testing real edge cases due to improper / invalid test data
▪ The test data refresh exercise should keep in mind:
▪ Changes in the existing DB schema, data constraints, rules, etc.
▪ Addition of new database tables or fields
▪ Changes in functionality
▪ Changes in the DB schema or data constraints of the 3rd party systems which might impact the application's functionality
▪ Test Data and Environment Strategy
Dev
▪ Testing Strategy: Unit testing; in-sprint scenario testing
▪ Test Data Source: Manual creation; automated scripts / SQL scripts
▪ Test Data Clean-up: Recreate the DB schema
▪ Test Data Maintenance: Refreshed when there is a change in the DB schema, constraints, or API contracts
QA
▪ Testing Strategy: In-sprint testing; functional testing; automation testing
▪ Test Data Source: Automated scripts / SQL scripts / bulk HL7 / EDI files; copy, de-identify, and load from the Prod database
▪ Test Data Clean-up: Run scripts to clear the test data, i.e., change the status of records; recreate the DB schema
▪ Test Data Maintenance: Periodic refresh (decided based on the release and changes planned); refreshed when there is a change in the DB schema, constraints, API, etc.
Integration
▪ Testing Strategy: End-to-end testing; integration testing; release testing
▪ Test Data Source: Automated scripts / SQL scripts / bulk HL7 / EDI files; copy, de-identify, and load from the Prod database; ensure connectivity and data flow between 3rd party systems; real data, or mock the test data from 3rd party systems
▪ Test Data Clean-up: Run scripts to clear the test data, i.e., change the status of records; recreate the DB schema
▪ Test Data Maintenance: Periodic refresh (decided based on the release and changes planned); refreshed when there is a change in the DB schema, constraints, API, etc.
UAT
▪ Testing Strategy: User acceptance testing; integration testing
▪ Test Data Source: Copy, de-identify, and load from the Prod database; real data should flow from 3rd party systems
▪ Test Data Clean-up: Recreate the DB schema
▪ Test Data Maintenance: Refreshed for every new release; periodic refresh (decided based on the release and changes planned); refreshed when there are changes in the DB schema, constraints, API, etc.
1.2.9 CoE Governance
1.2.9.1 Project Framework
The client / vendor should use a project management process that includes a formal set of tools and techniques to initiate, plan, execute, monitor, control, and close projects. It should include:
▪ A clear project framework for achieving project-specific goals and business goals
▪ Emphasis on phased execution (i.e., regular and measurable progress)
▪ A systematic approach to resolving high-risk factors associated with an objective
Throughout the implementation, the client team should be equipped with the correct tools and resources to assist with strategic decisions, and the client / vendor should work to identify critical use cases that will support their vision and objectives.
The QA strategy should be tailored to meet the client's preferences and accommodate varying levels of technical capability.
A phased implementation approach is recommended, keeping in mind the following objectives for ensuring sustainable success of the initiative:
▪ Ensure rapid go-live
▪ Minimize risk and demonstrate value early
▪ Ensure effective adoption
▪ Iteratively evolve the solution with user feedback
▪ Realize quick and sustained ROI
1.2.9.2 Roles and Responsibilities
CitiusTech Role – Key Responsibilities
▪ Healthcare QA Manager
• Project planning and coordination
• Team management and technical leadership
• Weekly status reporting covering progress, plan, issues, and risks
▪ Healthcare QA Lead
• Client coordination
• Sprint planning, backlog & retrospective
• Daily standup and sprint velocity
▪ Healthcare QA Engineer
• Create test plan & test cases for functional and non-functional requirements
• Test data setup & test case execution for functional, system integration & performance testing
• Defect tracking and re-testing
▪ Security Testing Specialist
• Information security and privacy requirements
• Security testing scenarios, test cases, test execution, test result reporting
• VAPT and support for regulatory / industry certification efforts
▪ Performance Specialist
• Performance SLA review / discussions
• Performance testing scenarios, test cases, test execution, test result reporting
• Support certification efforts
1.2.9.3 Project Lifecycle
Following is a schematic representation of the Agile (Scrum) methodology for project execution:
1.3.1 Execution Process for Each Application within the CoE
1.3.1.1 Requirement Analysis
1. PO / BA to conduct a refinement session with the QA team before the sprint starts (Owner & Participants: PO / BA / SM or Project Manager; Templates: NA; Tools / Technology: Meeting)
2. PO to document all the user stories with acceptance criteria before the sprint starts (Owner: PO; Templates: NA; Tools: Agile Tool)
3. PO to document the impacted areas in the user story (in case of legacy systems) (Owner: PO; Templates: NA; Tools: Agile Tool)
4. PO to attach detailed requirement documents (mockups / wireframes), if any, to the user stories (Owner: PO; Templates: NA; Tools: Agile Tool)
1.3.1.2 Sprint Planning / Release Planning
1. Only prioritized user stories with clear requirements should be selected for sprint planning (Owner & Participants: Scrum Team; Templates*: NA; Tools / Technology: NA)
2. During planning, QA estimates (story points) should be considered (Owner: QA Team / SM; Templates: CT Estimation Guidelines, CT Template Sprint Tracker; Tools: Agile Tool, etc.)
3. The team should be clear on the Definition of Done for QA tasks (Owner: QA Team; Templates: refer Appendix 3.1; Tools: Agile Tool, etc.)
4. User story state should be correctly maintained (Owner: Scrum Team; Templates: refer Appendix 3.2; Tools: Agile Tool, etc.)
5. Create the QA project plan to define overall QA processes (Owner: QA Lead; Templates: CT PMP - QA Project; Tools: Word / Excel document, etc.)
6. Form the team based on requirements & estimates (skill sets, experience, etc.) (Owner: Project Manager; Templates: refer Appendix 3.3; Tools: NA)
* Templates mentioned are created as part of the SmartTest and SmartAutomation processes followed at CitiusTech. These work as accelerators which can be readily used for any project with appropriate customization.
1.3.1.3 Test Design (Manual)
1. Create the Test Approach and Strategy document for the project (Owner: QA Lead; Templates: CT Test Approach and Strategy; Tools: PowerPoint, etc.)
2. Post the requirements grooming session, QA team to brainstorm and create scenarios (Owner: QA Team; Templates: CT Template Test Scenario; Tools: Excel, etc.)
3. Manual testing effort estimation (Owner: QA Team; Templates: CT Template Test Estimation-Manual; Tools: Excel / Agile Tool, etc.)
4. Review of scenarios by BAs and Devs (Owner: QA / BA / Dev; Templates: CT Review Tracker; Tools: Excel, etc.)
5. QA team to create test cases based on scenarios (Owner: QA; Templates: CT Template Test Suite; Tools: TCMS – TFS, ALM, Zephyr, etc.)
6. Peer and lead review of the test cases (Owner: QA; Templates: CT Template Test Case Review Checklist; Tools: Excel, etc.)
7. Define the KPIs and the SLAs (Owner: QA Lead; Templates: refer Appendix 3.4; Tools: Excel, etc.)
1.3.1.4 Test Design (Automation)
1. Create the Automation Strategy document (Owner: Automation Lead; Templates: CT Template Automation Strategy; Tools / Technology: Document)
2. Estimate automation efforts based on the requirements gathered (Owner: Automation Lead / Team; Templates: CT Template Test Estimation-Automation; Tools / Technology: Estimation document)
3. Design the automation framework for the project (Owner: Automation Lead / Team; Templates: Automation Testing Process and Best Practices, Automation Frameworks; Tools / Technology: Automation code in the tool)
4. Automate the test cases and create automation scripts (Owner: Automation Team; Templates: create coding guidelines, sample: CT VBScript Automation Coding Guidelines; Tools / Technology: Automation scripts in the tool)
5. Track the automation status (Owner: Automation Lead; Templates: CT Template Automation Tracker; Tools / Technology: Automation tracker document)
1.3.1.5 QA Environments and Build Strategy
▪ Specific user story testing for release R1 is done during the sprint cycles of R1
▪ Post the sprint testing, the team gives sign-off on the sprints – on the CI and QA environments
▪ Regression testing of the overall stories for release R1 (across sprints) is still pending at this point
▪ The integrated build for the release is deployed in the Integration environment, which is akin to a pre-prod environment
▪ Regression testing of release R1 is done in the sprint cycles of the next release, i.e., R2, along with the user stories for release R2
▪ After that, R1 is released into production
▪ Regression testing of R2 stories is executed in the R3 release sprint cycles
1.3.1.6 Test Execution (Manual)
1. Execute the test scenarios (Owner: QA Team; Templates: CT Template Test Execution Result; Tools: Zephyr / TFS / QC, etc.)
2. Log defects for issues and failed test cases (Owner: QA Team; Templates: CT Template Test Defect Report; Tools: Jira / TFS / QC, etc.)
3. Defect lifecycle – triage the defects and assign them to the developers (Owner: PO / QA Team; Templates: refer Appendix 3.5)
4. Send the test execution status reports to the stakeholders (Owner: QA; Templates: CT Template Test Execution Result, CT Template Test Status Report; Tools: Jira / TFS / QC dashboards, etc.)
5. Provide QA sign-off on the final build for a sprint or a release (Owner: QA; Templates: CT Template QA Sign Off; Tools: Word document, etc.)
6. Conduct RCA of the defects escaped to production (Owner: QA Team; Templates: CT Template Defect Analysis; Tools: Excel document, etc.)
1.3.1.7 Test Execution (Automation)
1. Execute the automation scripts in batches and send daily reports (Owner: Automation Team; Templates: CT Template Batch Detail and Summary Report, CT Template Batch Summary Report; Tools: Automation Tool / TCMS (Automation Anywhere, Katalon, Selenium, etc.))
1.3.2 Suggested Flow
Team Structure
▪ A formal Agile team structure should be formed
▪ Roles and responsibilities for the scrum team should be defined
▪ QA members (manual + automation) should be part of the scrum team
PO / BA
▪ PO should create the product backlog
▪ PO sign-off for user stories should be obtained before testing starts
Dev
▪ Dev team should share build release notes and provide unit test results
▪ Dev team should provide impact analysis of the fixes / changes made
QA
▪ Each team member should attend the sprint planning meeting and create the sprint backlog
▪ Test strategy and test plan should be created
▪ The estimation template should be used for user stories for sprint planning
▪ QA team should create manual test cases in the TCMS
▪ Peer review of manual cases should be done, and sign-off from the BA should be taken
▪ QA team should execute an automated BVT (Build Verification Test) after each build
▪ Jira should be used to track requirement traceability
▪ Smoke and regression suites should be identified
▪ QA team should identify business-critical scenarios and add them to the regression suite
▪ Automate regression cases (API, Web, Mobile)
▪ Automation and manual test execution reports should be shared with the team
2. KPIS TO ACHIEVE BUSINESS BENEFITS
2.1 Testing Effectiveness
▪ Defect Escapes
• With reduced defect escapes through RCA and defect prevention, the client can achieve Improved Quality

Defect Escape (%) = (# of Client reported defects * 100) / ((# of QA defects - # of QA defects rejected) + # of Client reported defects)

▪ Test Cycle Execution Time
• By reducing the execution cycle through automation and multiple environments, the client can achieve Faster Time to Market
• Test Execution Time = total time taken to run the test cases (manual and automation) for a build or release
▪ Test Coverage
• Testing coverage will increase due to automation, which will help achieve Improved Quality

Test Coverage = Total no. of requirements tested / Total no. of requirements released
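The two formulas above can be computed directly; a small Python sketch with illustrative numbers:

def defect_escape_pct(client_defects, qa_defects, qa_rejected):
    # Client-reported defects as a percentage of all valid defects found.
    valid_qa = qa_defects - qa_rejected
    return client_defects * 100 / (valid_qa + client_defects)

def test_coverage(requirements_tested, requirements_released):
    # Fraction of released requirements that were actually tested.
    return requirements_tested / requirements_released

# Example: 3 client-reported defects against 120 valid QA defects (125 raised, 5 rejected).
print(f"Defect escape: {defect_escape_pct(3, 125, 5):.1f}%")  # 2.4%
print(f"Test coverage: {test_coverage(47, 50):.0%}")          # 94%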
2.2 Tools
▪ Hardware / Software License Usage
• With optimal usage of hardware and software, the client can achieve Improved Quality as well as Faster Time to Market
• Use of software license metrics is recommended here (Named User / Role, Concurrent User / Instance, Instance, Managed Capacity, Performance, Physical Machine, Machine Compute Capacity)
▪ Environment Usage
• With a managed build deployment and testing strategy, the client can achieve Improved Quality, as the right environment will be used for the right kind of testing
• Multiple environments will also allow simultaneous testing to be conducted and thus facilitate Faster Time to Market
• An environment usage matrix will help define which environment should be used for which testing and at what stages
2.3 People
▪ Resource Skill Index
• With the skill set of the resources improving through cross-skilling and documentation, the client can achieve Faster Time to Market as well as Improved Quality
• Sample Skill Matrix:

Team Member | Mod 1 | Mod 2 | Tool 1 | Tool 2 | Dom 1 | Dom 2
Mem 1       |   1   |   3   |   2    |   3    |   2   |   3
Mem 2       |   4   |   3   |   2    |   3    |   3   |   4
Total       |   5   |   6   |   4    |   6    |   5   |   7

Weight Descriptor:
1 – Needs training
2 – Currently being trained
3 – Trained & has knowledge to test independently
4 – Trained & can train others

• Based on the total score, we know which module / tool / domain is least known and needs more attention from a training perspective
• We can also set a threshold: if the score of an area is less than the threshold, that area needs attention (a minimal sketch follows this section)
▪ Resource Utilization
• With proper usage of resources across projects, the client can achieve Faster Time to Market
• Based on the skill matrix and the availability of members, resources can be utilized across different projects to ensure they are leveraged optimally
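A minimal Python sketch that totals the sample skill matrix above and flags areas below a chosen threshold (the threshold value here is an assumption):

SKILLS = ["Mod 1", "Mod 2", "Tool 1", "Tool 2", "Dom 1", "Dom 2"]
MATRIX = {
    "Mem 1": [1, 3, 2, 3, 2, 3],
    "Mem 2": [4, 3, 2, 3, 3, 4],
}
THRESHOLD = 5  # hypothetical cut-off below which an area needs training focus

# Sum each skill column across team members and flag weak areas.
totals = [sum(scores) for scores in zip(*MATRIX.values())]
for skill, total in zip(SKILLS, totals):
    flag = "  <- needs attention" if total < THRESHOLD else ""
    print(f"{skill}: {total}{flag}")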
3. APPENDIX
3.1 Template for Tasks for User Stories
The following QA tasks should be created for a user story:
▪ Scenario brainstorming
▪ Test scenario creation
▪ Test scenario review with peers
▪ Test scenario review with BA and Dev
▪ Test case creation
▪ Test case review with peers
▪ Test case review with lead
▪ Test execution
▪ Regression testing
▪ Defect re-testing
▪ Test status reporting
3.2 User Story States
A user story should be in one of the following states:
▪ New: The user story should be in the New state when it is added in the Agile Tool
▪ ReadyForRefinements: The user story should be in this state when the Business Team / BA is clear on the requirements
▪ ReadyForSprint: The user story should be in this state when the team knows the requirements and is ready for sprint planning
▪ ReadyForDev: The user story should be in the ReadyForDev state when it is assigned to a sprint
▪ InDevelopment: The user story should be in the InDevelopment state when the developer has picked up the user story for development
▪ ReadyForIntegration: The user story should be in the ReadyForIntegration state when it is ready for deployment on the QA environment
▪ InIntegration: The user story should be in the InIntegration state when it is deployed to the QA environment and ready for the QA team to test
▪ ReadyForRelease: The user story should be in this state when the QA team has finished testing and it is ready to be deployed to production
▪ Closed: The user story should be in the Closed state after it is deployed to production
3.4 KPIs
Projects – Agile Methodology
Productivity Metrics
▪ Done to Said Ratio (Unit: ratio)
• Parameters: # of Story points delivered; # of Story points planned
• Formula: # of Story points delivered / # of Story points planned, per sprint
• Remarks: An indicator of the productivity of the scrum team
▪ Sprint Velocity (Unit: story points per team)
• Parameters: # of Story points delivered
• Formula: # of Story points delivered
• Remarks: A measure of how much functionality the team can deliver. An indicator of productivity
▪ Sprint Velocity per Person (Unit: story points per person)
• Parameters: # of Story points delivered; # of members in the team
• Formula: # of Story points delivered / # of members in the team
• Remarks: A measure of how much functionality the team can deliver per person per sprint. An indicator of productivity
▪ Manual Test Case Preparation Productivity (Unit: manual test cases prepared per person day)
• Parameters: # of test cases created; total test case creation effort
• Formula: # of test cases created / total test case creation effort in PDs
• Remarks: An indicator of test case writing productivity
▪ Automation Test Case Preparation Productivity (Unit: automation test cases prepared per person day)
• Parameters: # of automation test cases created; total test case creation effort
• Formula: # of automation test cases created / total automation test case creation effort in PDs
• Remarks: An indicator of test case writing productivity
▪ Manual Test Case Execution Productivity (Unit: manual test cases executed per person day)
• Parameters: # of test cases executed; total test execution effort in PDs
• Formula: # of test cases executed / total test execution effort in PDs
• Remarks: An indicator of test execution productivity
▪ Rate of Test Automation (Unit: %)
• Parameters: # of test steps to be automated; # of test steps automated
• Formula: (# of test steps automated / # of test steps to be automated) * 100
• Remarks: An indicator of the team's rate of automation
Quality Metrics
▪ Defect Density (size in story points) (Unit: defects per story point)
• Parameters: # of QA defects; # of Client reported defects; # of QA defects rejected; total size in story points
• Formula: (# of QA defects + # of Client reported defects - # of QA defects rejected) / total size in story points
• Remarks: An indicator of the quality of the deliverable; defects per story point for a release
▪ Defect Leakage to Client (Unit: %)
• Parameters: # of QA defects; # of QA defects rejected; # of Client reported defects
• Formula: (# of Client reported defects / ((# of QA defects - # of QA defects rejected) + # of Client reported defects)) * 100
• Remarks: A measure of the effectiveness of test execution
▪ Manual Test Case Execution (Unit: %)
• Parameters: # of test cases planned to execute; # of test cases executed
• Formula: (# of test cases executed / # of test cases planned to execute) * 100
• Remarks: An indicator of test execution coverage
Projects – Non-Agile Methodology
Productivity Metrics
▪ Delivery Commitment (Unit: %)
• Parameters: # of deliveries delivered; # of deliveries planned
• Formula: # of deliveries delivered / # of deliveries planned, per month or release
• Remarks: An indicator of commitment to the planned deliveries
▪ Manual Test Case Preparation Productivity (Unit: manual test cases prepared per person day)
• Parameters: # of test cases created; total test case creation effort
• Formula: # of test cases created / total test case creation effort in PDs
• Remarks: An indicator of test case writing productivity
▪ Automation Test Case Preparation Productivity (Unit: automation test cases prepared per person day)
• Parameters: # of automation test cases created; total test case creation effort
• Formula: # of automation test cases created / total automation test case creation effort in PDs
• Remarks: An indicator of test case writing productivity
▪ Manual Test Case Execution Productivity (Unit: manual test cases executed per person day)
• Parameters: # of test cases executed; total test execution effort in PDs
• Formula: # of test cases executed / total test execution effort in PDs
• Remarks: An indicator of test execution productivity
▪ Rate of Test Automation (Unit: %)
• Parameters: # of test steps to be automated; # of test steps automated
• Formula: (# of test steps automated / # of test steps to be automated) * 100
• Remarks: An indicator of the team's rate of automation
Quality Metrics
▪ Defect Density with respect to Size in Story Points (Unit: defects per story point)
• Parameters: # of QA defects; # of Client reported defects; # of QA defects rejected; total size in story points
• Formula: (# of QA defects + # of Client reported defects - # of QA defects rejected) / total size in story points
• Remarks: An indicator of the quality of the deliverable; defects per story point for a release
▪ Defect Leakage to Client (Unit: %)
• Parameters: # of QA defects; # of QA defects rejected; # of Client reported defects
• Formula: (# of Client reported defects / ((# of QA defects - # of QA defects rejected) + # of Client reported defects)) * 100
• Remarks: A measure of the effectiveness of test execution
▪ Manual Test Case Execution (Unit: %)
• Parameters: # of test cases planned to execute; # of test cases executed
• Formula: (# of test cases executed / # of test cases planned to execute) * 100
• Remarks: An indicator of test execution coverage
3.6 User Story Guidelines
User stories should follow the INVEST principle:
▪ Immediately Actionable
• Can it be delivered independently?
• Is it free from external blockage?
▪ Negotiable
• Is it descriptive enough to support team debate and conversation?
▪ Valuable
• Does it deliver a customer- or business-visible benefit?
▪ Estimable
• Is it clear enough that the team can estimate it?
▪ Sized to Fit
• Is it divided into small enough blocks to complete within a sprint?
▪ Testable
• Are there clear acceptance criteria to know when it is "good enough"?
Ways to prioritize the user story (MoSCoW):
▪ M: Must. Describes a requirement that must be satisfied in the final solution for the solution to be considered a success
▪ S: Should. Represents a high-priority item that should be included in the solution if possible. This is often a critical requirement, but one which can be satisfied in other ways if absolutely necessary
▪ C: Could. Describes a requirement which is considered desirable but not necessary. This will be included if time and resources permit
▪ W: Won't. Represents a requirement that stakeholders have agreed will not be implemented in a given release but may be considered for the future
3.7 Acceptance Criteria Guidelines
What are acceptance criteria?
▪ Acceptance criteria define the boundaries of a user story, and are used to confirm when a story is completed and working as intended
▪ Acceptance criteria are a set of statements, each with a clear pass / fail result, that specify both functional and non-functional requirements; they are applicable at the epic, feature, and story level
▪ What are acceptance criteria used for?
• To define boundaries
• To reach consensus
• To serve as a basis for tests
▪ Acceptance criteria are usually initiated by the Product Owner or BA, but other team members can also participate in defining the acceptance criteria for each story
▪ Example for the password of a login module:
• Must be at least 8 characters and no more than 12
• Must contain only alphanumeric characters
• Must contain at least one digit
• Must contain at least one letter
• Etc. (there would be more such criteria)
How to write acceptance criteria?
▪ Rules-oriented (in the form of a list) or scenario-oriented (in the form of scenarios that explain each criterion)
▪ A common form of writing acceptance criteria is the GWT (Given / When / Then) format derived from the BDD framework
• E.g. 1: User story for searching a patient
Given the physician is logged in the system
And is on the worklist screen
When the physician enters the name of the patient
And clicks the Search button
Then the patients matching the search criteria show up on the Worklist
• E.g. 2: User story for validating mandatory fields for case creation
Given the nurse is logged in the system
And is on the <case creation> screen
When the nurse enters the CPT, ICD, and Physician fields
Then the Create Case button is enabled
Given the nurse is logged in the system
And is on the <case creation> screen
When the nurse fails to enter any of the following – CPT, ICD, or Physician fields
Then the Create Case button is disabled
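GWT criteria written this way map directly onto BDD automation. A minimal sketch using the behave library for the first example above; the page-object helpers on context.app are hypothetical:

from behave import given, when, then

@given("the physician is logged in the system")
def step_login(context):
    context.app.login(role="physician")  # hypothetical page-object helper

@given("is on the worklist screen")
def step_worklist(context):
    context.app.open_worklist()

@when("the physician enters the name of the patient")
def step_enter_name(context):
    context.app.search_box.type("Doe, Jane")

@when("clicks the Search button")
def step_click_search(context):
    context.app.search_button.click()

@then("the patients matching the search criteria show up on the Worklist")
def step_verify_results(context):
    assert context.app.worklist.contains_patient("Doe, Jane")

Because each step decorator quotes the acceptance criterion verbatim, the criteria serve as both the specification and the executable test.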
ABOUT THE AUTHOR
Prithu George
Assistant Vice President – Enterprise Application Proficiency, CitiusTech
prithu.george@citiustech.com
Prithu George has 15 years of experience in project delivery management, with hands-on expertise in the healthcare domain that includes HL7 v2.6 and EHR / EMR applications. She has worked with leading healthcare organizations to develop their automation frameworks, implement automation best practices, and set up entire healthcare QA Centers of Excellence.
At CitiusTech, Prithu leads the QA Practice, which involves creating, reviewing, and enhancing software testing related processes for clients and in-house projects. She is certified in HL7 v2.6, CSQA, and CSM, and holds a Bachelor's degree in Information Technology.