1. The document discusses best practices for scaling functional testing in an agile environment, including evaluating your current testing strategy, defining a testing process, training your team, implementing testing into your continuous integration pipeline, and continuously improving the process.
2. It provides examples of defining a testing process across development, integration, staging, and production environments and implementing notifications and quality gates at each step of the pipeline.
3. The key is finding the right balance of exploratory, manual, and automated testing as part of a well-defined, continuously improving strategy baked into the deployment pipeline.
4. Largest Global Community Of Software Testers
300 Thousand community members
2.4 Million devices
200+ countries and territories
1+ Million vetted submissions/year
• Access to skilled and highly vetted global talent
• Real people, real environments
• Team engaged within hours on demand
• Curated based on required skills, demographics, locations, devices
• Engaged through gamification and meritocracy
6. Applause Solutions
DIGITAL TESTING: Deliver experiences that work every time for everyone
• Manual Functional Testing
• Automated Functional Testing
• Accessibility Assessments
OMNICHANNEL FEEDBACK & TESTING: Provide intuitive and engaging experiences for your customers
• Digital Customer Journeys
• In-Field Customer Journeys
• Usability Studies
PAYMENT TESTING: Ensure successful and accurate payments across the globe
• Transaction Validation
• Digital Wallet Testing
APPLAUSE PLATFORM
Security Testing
7. Pillars Of Applause Functional Testing
Exploratory Testing
• Goal: Find unknown defects “in the wild”
• How We Do It: 10-20 testers are given general or targeted scope to find defects
• Output: Detailed bug reports with reproducible steps, pictures and videos of bugs
Structured Testing
• Goal: Prove key digital paths work as built
• How We Do It: Dedicated hours to write, maintain and have testers follow scripted test cases
• Output: Pass/Fail reporting across manually run structured testing
Test Automation
• Goal: Increase speed and reduce cost of high-volume, repeatable structured testing
• How We Do It: White-glove managed service + Applause Automation Framework + Expert automation engineers
• Output: Pass/Fail analysis across automated structured testing + Automation dashboard for trending and deeper analysis
8. Applause Continuous Testing Through Manual & Automation
Applause TCOE + Customer’s Development Team
Applause ITW Manual Testing: Exploratory Testing & Test Case Execution
Applause Automation: Java Appium & Selenium (more details in Automation section)
SDLC: CI Server → Unit Testing → Smoke Testing
Code released from development has already passed automated regression testing
Developer notified if new code breaks automated tests
10. Challenges for Agile Testing
Developing A Testing Strategy: Each company and application is different. There is no silver-bullet approach.
Implementing A Deployment Pipeline: A testing strategy must be properly baked into a deployment pipeline to minimize friction.
Maintenance At Scale: Scaling functional testing efficiently means doing so with a strategy in mind.
Continuous Improvement: Integrating continuous feedback, analytics and reporting into the testing process spans well beyond automating a test.
12. Self Evaluation
TEAM
• What is the team makeup?
• Where are the expertise gaps?
• Are QA and Dev teams working in silos?
TECHNOLOGY
• What does the deployment pipeline look like?
• How are automated tests triaged?
• How is automation integrated with a VCS/TCM/BTS?
PROCESS
• When are tests written?
• How are bugs triaged and tests updated?
• How fast are sprint cycles?
• How does feedback guide test strategy?
REPORTING
• What is the test coverage?
• What are the common devices used?
• How are test results viewed?
• How is a “go/no-go” decision made?
• What is the automation ROI?
PAIN
• What are the existing pain points?
• What bugs have been missed?
13. The Functional Testing Maturity Model
Maturity levels: Regressive → Repeatable → Consistent → Quantitative → Optimizing
Axes: Maturity, Release Velocity and Value vs. Scope of Testing (Exploratory, Manual Structured Testing, Automation)
Milestones along the curve: Planned Critical Path, Measured ROI, Real-Time Feedback
14. Maturity Assessment
1. Evaluating your maturity and mapping out the journey is key to mastering an effective strategy
2. Apply the right blend of exploratory, manual and automated testing as part of your testing strategy
3. Implement your strategies in a continuous and procedural fashion
16. Beginner: Functional Testing Across Customer SDLC
Swimlanes: DEVELOPMENT TEAM, UNIT TESTS, AUTOMATED TESTS, MANUAL TESTS, ADDITIONAL TESTS/RELEASE
Flow: Check In → Smoke Tests → Regression → Deploy, with feedback to development after each testing stage
17. Intermediate: Functional Testing Across Customer SDLC
Swimlanes: DEVELOPMENT TEAM, UNIT TESTS, AUTOMATED TESTS, MANUAL TESTS, ADDITIONAL TESTS/RELEASE
Flow: Check In → Integration Smoke Tests → UI Smoke Tests → Exploratory → Automated Regression → Manual Regression → Deploy
18. Advanced: Functional Testing Across Customer SDLC
Swimlanes: DEVELOPMENT TEAM, UNIT TESTS, FUNCTIONAL TESTING, ACCEPTANCE
Flow: Check In → Deploy INT → Deploy QA → Deploy STAGE, with quality gates between environments
Testing at each stage: Unit Test → Automated Smoke Test → Automated Regression → Time Boxed Exploratory → Manual Regression → Extended Automation → Broad Exploratory Testing → Manual Acceptance
Continuous Feedback, Reporting & Analytics span the whole pipeline
20. Key Components
Scaling testing while managing risk and cost can be broken down into four key areas:
1. Defining the process
2. Training your team
3. Implementing into the CI pipeline
4. Improving the process
21. Defining the Process
1. Self-evaluate: Who do you have on your team and what are their skills? What processes do you have in place today?
2. Set a goal for the amount of automation/exploratory testing needed in your process and determine where you would like to be in your CI progress.
3. Define what the “gates” or “handshakes” are based on the teams that you have.
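One way to make the “gates” and “handshakes” in step 3 concrete is to write them down as data before wiring up any tooling. A minimal Python sketch; the stage names, owners, and check names are illustrative, not prescribed:

```python
from dataclasses import dataclass, field

@dataclass
class QualityGate:
    """A handshake between two teams: who owns the call, and what must pass."""
    name: str                     # e.g. "integration -> staging" (hypothetical)
    owner: str                    # team accountable for the go/no-go decision
    required_checks: list = field(default_factory=list)

    def is_open(self, passed_checks):
        """The gate opens only when every required check has passed."""
        return all(c in passed_checks for c in self.required_checks)

# Example pipeline definition (names are illustrative)
gates = [
    QualityGate("dev -> integration", "Dev Lead", ["unit_tests"]),
    QualityGate("integration -> staging", "QA Lead",
                ["automated_smoke", "manual_validation"]),
    QualityGate("staging -> production", "Release Manager",
                ["automated_smoke_on_prod_data"]),
]

passed = {"unit_tests", "automated_smoke"}
print([g.is_open(passed) for g in gates])  # [True, False, False]
```

Writing the gates down this way forces the conversation the slide describes: every handshake gets a named owner and an explicit pass condition.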
22. Client Example - Hotfix Training
Swimlanes: DEVELOPMENT TEAM, INTEGRATION, STAGING, PRODUCTION
Flow: Check In → Unit Test → Manual Hotfix Validation + Automated Smoke Test → Push to Stage → Automated Smoke Test on Production Data → Push to Production → Automated Smoke Test on Production Data
Feedback to Dev at each stage
23. Training Your Team
1. Based on your process, who needs to be trained?
2. At each handoff, your team will need to know the appropriate information to pass on.
3. Align the team on common reporting and metrics.
24. Client Example - Hotfix Training
Swimlanes: DEVELOPMENT TEAM, INTEGRATION, STAGING, PRODUCTION
Flow: Unit Test → Manual Hotfix Validation + Automated Smoke Test → Automated Smoke Test on Production Data (staging) → Automated Smoke Test on Production Data (production), with Feedback to Dev
Training points at each handoff:
• Train Developers or Release Managers
• Train Manual Team to test and give feedback
• Train Automation Team to triage and give feedback
• Train QA Stakeholder to interpret test results (at staging and at production)
• Train Dev/Product to triage bugs coming back and iterate quickly
26. Client Example: Implementing into CI Pipeline
Swimlanes: DEVELOPMENT TEAM, INTEGRATION, STAGING, PRODUCTION
Notifications at each step: Build Uploaded, Testing Started, Testing Ended, Bugs Logged
Flow: Check In → Unit Test → Manual Hotfix Validation + Automated Smoke Test → Ready to Push → Push to Staging → Automated Smoke Test on Production Data → Push to Production → Automated Smoke Test
Feedback to Development throughout
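The four notification points above (Build Uploaded, Testing Started, Testing Ended, Bugs Logged) can be sketched as events the pipeline emits to a shared sink. Here an in-memory list stands in for a chat webhook or ticketing integration, and the payload shape is an assumption:

```python
# The four notification points from the example pipeline. In real use the
# sink would be a chat webhook or ticketing system; here it is a plain list.
EVENTS = ["Build Uploaded", "Testing Started", "Testing Ended", "Bugs Logged"]

def notify(event, build, sink):
    """Record a pipeline event so every handoff leaves an audit trail."""
    payload = {"event": event, "build": build}
    sink.append(payload)
    return payload

audit_trail = []
for event in EVENTS:
    notify(event, build="hotfix-1.2.3", sink=audit_trail)

print([e["event"] for e in audit_trail])
# ['Build Uploaded', 'Testing Started', 'Testing Ended', 'Bugs Logged']
```

The point is not the mechanism but the discipline: every stage announces itself, so the audit trail exists from day one rather than being retrofitted at scale.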
27. Improving the Process
1. Find ways to make your process more efficient by streamlining timing
2. Automate repetitive tasks
3. Implement a CI pipeline
4. Invest in quality and consistent reporting
28. Client Example - Hotfix Training
Swimlanes: DEVELOPMENT TEAM, INTEGRATION, STAGING, PRODUCTION
Flow: Check In → Unit Test → Manual Hotfix Validation in parallel with Automated Smoke Test → Push to Stage → Automated Smoke Test on Production Data → Push to Production → Automated Smoke Test on Production Data
Feedback to Dev at each stage
Hi! Welcome to the webinar.
Today we're going to....
Drew
Applause is the leader in digital quality and crowdtesting.
Our community-driven approach incorporates real people and insights into every phase of your SDLC – letting you innovate faster and deliver experiences that truly resonate with your customers.
Drew
Our community is the largest and most accomplished in the world. Wherever your customers are located, whatever language they speak, and whatever device they use – we cover it within the community.
The diversity of our community members helps you augment or fill-in expertise gaps within your company as well as provide accurate end-user perspectives.
All of our community members are vetted and rated based on their expertise -- manual testing, automation, security, usability, accessibility and more -- and are paid when they deliver results.
Drew
Over the last 10 years, we have had the pleasure to work with and learn from the thousands upon thousands of the most innovative, brand conscious organizations as they travel down their own digital paths. Along the way, we have tested thousands of applications and digital experiences over a wide range of vertical industries, and helped our clients identify millions upon millions of defects before they ever reach their customer end users.
Cathy
Applause has a vast portfolio of full-service testing and feedback, including digital testing, omnichannel, and payment testing.
Digital Testing : including manual and automated functional testing and accessibility assessments.
Our Omnichannel Feedback & Testing focuses on digital customer journeys, in-field customer journeys and usability studies.
And finally, our Payment Testing, which helps our customers ensure successful and accurate payments globally, includes transaction validation and digital wallet testing.
Cathy
As we will mention, the Functional Test Maturity Model requires proper execution of the three pillars of functional testing, including Manual Exploratory, Manual Structured, and Automated Testing.
Going through each: first we have Manual Exploratory, which aims to find unknown defects “in the wild”. It is done by having testers follow a general or targeted scope and use the application as normal end-users would, recording detailed bug reports when unexpected conditions or errant behavior are found.
Next we have Structured Testing, also known as Functional Regression Testing. The primary intent with this testing is to validate that the key digital paths of the application under test work as they were designed to. This is accomplished by dedicating time to write, maintain, and have testers follow scripted test scenarios to achieve an acceptable level of testing coverage and to deliver pass/fail reporting across those structured test suites.
Finally, we have Test Automation. Test Automation is just a faster way to execute structured functional regression testing. We increase speed and reduce the cost of high volume, high coverage, repeatable structured testing. We accomplish this by working with our customers to understand their regression needs, and then creating automated suites of robust and stable automated scenarios. These scenarios are executed to achieve that same level of testing coverage and to deliver pass/fail reporting across those structured test suites.
Today we’ll be spending time on how we can find a balance between exploratory and structured manual testing and structured automation testing.
Drew
How do we implement these solutions for our customers?
We embed into our customer’s SDLC. We use in-the-wild manual testing, and then, shifting left, we use the Applause Automation framework to do more …. cover CI Server and integrations
Provide some context around the SDLC itself and how Applause can help facilitate. In the high level example, most of this works around your CI process.
The dev team builds
Smoke Test
Applause ITW Manual Testing -> How does our in-the-wild testing work as part of the stakeholders making the decision.
Drew
Focus today is on UI and Service/Integration steps+best practices.
UI + Service is more expensive and the approach is more variable as it’s more product specific
We've done.. (recap)
Another thing to take on is...
Drew - Expertise, best practices and “lessons learned” lead to a tailored testing strategy.
Drew - A deployment pipeline oftentimes requires cross-team collaboration and organizational buy-in (e.g. DevOps)
Cathy- Focus more on maintenance here
Take a measured approach to scaling. Don’t build things you don’t need. Complement automation with other services to keep it tenable.
Cathy - Continuous feedback must be integrated back into your agile processes so you can iterate, measure and make data-driven decisions when it comes to investment and risk management.
Drew
Next we’re going over your testing strategy
Cathy
Key areas and questions that we ask our customers
Drew –
A familiar model to help understand where you stand and where you need to go.
WALK THROUGH EXAMPLE OF EACH AREA
Regressive – Automation here may just amplify problems, confidence issues
Optimized – Analytics and Insights result in data-driven, real-time optimizations
Automation isn’t going to solve all of your problems. There may be a larger percentage of automation for optimized teams, but with such a high release velocity manual testers should be doing continuous exploratory and human-oriented test case execution.
Cathy
Drew
Drew - Highlight the collaboration points.
This is how we usually start
Relatively easy to implement, quick win
Quickly get feedback to development
Smoke test could initially run nightly, but ideally 10-15 minutes w/ parallel
Requires basic process around automated and manual collaboration
Low upfront investment
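The note about getting a smoke suite from a nightly run down to 10-15 minutes relies on parallel execution. That idea can be sketched with a thread pool; the tests below are stand-ins (real ones would be Selenium/Appium scripts) and the timings are made up:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Stand-ins for real smoke tests; each returns (name, passed).
def make_smoke_test(name, duration, passed=True):
    def run():
        time.sleep(duration)          # simulate the test's runtime
        return name, passed
    return run

tests = [make_smoke_test(f"smoke_{i}", 0.01) for i in range(8)]

# Run serially these take ~the sum of durations; in parallel, ~the longest one.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = [f.result() for f in [pool.submit(t) for t in tests]]

failures = [name for name, ok in results if not ok]
print("PASS" if not failures else f"FAIL: {failures}")  # PASS
```

For browser or device tests the same shape applies, but the worker count is bounded by available devices or grid sessions rather than threads.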
Drew
Add second layer of testing (could have started with API, either way)
Remember the triangle, could have 3-5x integration/API tests
Environments, can you do the same? Ok if not, but strive for that.
Forcing function on required process around automated and manual collaboration
Manual Test Execution In Parallel
Run automated tests
For all failures, determine:
Real bug? Manually mark the test as failed.
Test needs update? Manually run, mark as passed, done!
Environment issue? Manually run, mark as passed, investigate!
Applause values the triage process highly and bakes it into the product - you should too (automation, manual and exploratory processes should be defined, with a single place to send test results and decide whether to ship)
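The triage decision table above is worth encoding, so that every automated failure gets exactly one of the three dispositions. A sketch; the action strings are illustrative:

```python
# Map each failure category from the triage process to its follow-up action.
TRIAGE_ACTIONS = {
    "real_bug": "mark test failed, file bug, feedback to development",
    "test_needs_update": "update test, re-run manually, mark as passed",
    "environment_issue": "re-run manually, mark as passed, investigate env",
}

def triage(failure_category):
    """Every automated failure must land in exactly one known bucket."""
    if failure_category not in TRIAGE_ACTIONS:
        raise ValueError(f"unknown category: {failure_category}")
    return TRIAGE_ACTIONS[failure_category]

for category in TRIAGE_ACTIONS:
    print(category, "->", triage(category))
```

Raising on unknown categories is the useful part: a failure that fits no bucket is a signal the triage process itself needs a decision, not a silent skip.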
Drew
Guidelines
Do “just enough” testing in each phase
Rapid, early feedback to developers
Build confidence at each quality gate before running more time-consuming, costly automation
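The “just enough per phase” guideline amounts to ordering gates from cheapest to most expensive and halting at the first failure, so developers get feedback before the costly suites run. A sketch with illustrative stage names and timings:

```python
# "Just enough" testing per phase: order gates from cheapest/fastest to most
# expensive, and stop at the first failure so feedback arrives early.
def run_pipeline(stages):
    for name, cost_minutes, run in stages:
        if not run():
            return f"stopped at {name} (after spending {cost_minutes} min)"
    return "all gates passed"

stages = [
    ("unit tests", 2, lambda: True),
    ("automated smoke", 10, lambda: False),   # fails: skip the long suites
    ("automated regression", 60, lambda: True),
    ("manual regression", 90, lambda: True),
]
print(run_pipeline(stages))  # stopped at automated smoke (after spending 10 min)
```

In this example the failing smoke test saves the 150 minutes the regression suites would have consumed, which is exactly the rapid-feedback property the guideline asks for.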
Cathy
Re-state: as we mentioned before, different maturity levels require you to think strategically about the types of process, technical implementation, and reporting you need to manage quality in your organization.
In this next section, we’ll go over a pretty detailed example, where we’ve helped one of our customers go through a maturity assessment and find the balance of exploratory, structured, and automated testing.
Cathy
Why are we talking about scalability?
Importance - as you scale make sure you have everything right before you do it. It’s an amplification process. You won’t see the scalability if the structure is weak. As soon as you amplify, everything else gets amplified.
We’re going to go through how to avoid that today.
Defining the process, training your team, implementing into the CI pipeline, and improving your process over time.
Cathy
Defining the process.
Doing the analysis: figuring out who you have in your org, what their roles are, what your goals for functional testing are and where you are right now. The process should define how these 3 types of testing methodologies work together as part of building, testing, and deploying your code. As your code is built and goes into integration or staging, think about all 3 of these types of testing and using them together at each of these different stages. Don’t think that they have to be separate. ** lead them a bit more here with guidance.
The contracts between them have to be solid, but
Develop a plan to figure out how are you going to get to that goal.
Setting up a process that works now, that pushes your org just enough that it gets better but doesn’t break things or is impossible to achieve.
Drew
This is a process we designed with a customer to get a hotfix release out the door as quickly as possible. You’ll have many of these processes depending on your release cadence and the features. This was the process that was implemented given where the team was in the maturity of its SDLC. This may be too slow for you, or this may be too fast if you don’t have any automation. Take an incremental approach given where you are.
**Go into how this was designed and why it worked.
Cathy
Make sure everyone is aligned and trained on any new or different roles. So many times when implementing a new process or trying to
Make sure to train how to communicate - that will make or break your team. For example, at one such handoff, let’s say your automation team comes back and says that you have 100 tests that have failed. This may seem pretty vital. However, if your product or engineering team tells the automation team that changes have been made to the application, then the automation team can come back and say: of the tests that weren’t impacted, only 5 failed; we have to go take the time to update the other 95 tests.
Do a dry run for your process and see how it works. Get everyone’s feedback and create buy in.
Drew
Developer or Release Manager needs to be trained to let manual testers know to validate the bug. Reference the bug in the Git history, links, and release notes.
Manual Test Lead needs to know to immediately test the bugs.
Development team needs to understand they’re getting quick feedback on the bugs and needs to iterate really quickly.
The automation team starts the automated tests and needs to let engineering know that testing is ready and that dev and product need to be ready to triage.
The continuous testing part of the pipeline will halt if it’s not all green. Kick back to development if there’s an automation-related failure.
Train the quality gate owner to promote the build to the next stage.
Do a trial run where the automation team is backed up manually. This will help build confidence for everyone. This is multiple teams working together.
Cathy
Need a holistic view of the results at each gate. If your automated and manual results are in different places in different formats, you’ll have a hard time understanding the quality of your application.
You want notifications when processes start and end, notification of results, and quality gates at each step.
Unified results - if the manual testers are reporting off a different set of data than the automated tests, it doesn’t help you reach an easy conclusion.
Auditing - you have to understand what went wrong in the process. If something went wrong you need to know why, and who the players were, to support continuous improvement. You need fast access to all the data points to figure out what went wrong and rectify the situation.
As the process matures, continuous improvement makes it faster - knowing which test cases are failing, on what build, etc.
The earlier you find the bugs, the more money you save, so you want that audit trail early on.
** use a ticketing system, email, etc. if you don’t have all the people to bake into a CI pipeline. This provides an audit trail as well. Really important to do this early on, before this scales, so that you can scale later.
Always strive for a way to get feedback and find ways to…
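One way to get the unified results and audit trail described above is to force manual and automated outcomes into a single record shape; the field names here are an assumption, not a prescribed schema:

```python
from datetime import datetime, timezone

def result_record(test, source, passed, build):
    """One shape for manual and automated results: one report for the gate owner."""
    return {
        "test": test, "source": source, "passed": passed, "build": build,
        "logged_at": datetime.now(timezone.utc).isoformat(),  # audit trail
    }

results = [
    result_record("login_smoke", "automation", True, "hotfix-1.2.3"),
    result_record("checkout_validation", "manual", False, "hotfix-1.2.3"),
]

# Unified view: failures from either source surface in one place.
failures = [r["test"] for r in results if not r["passed"]]
print(failures)  # ['checkout_validation']
```

Timestamping every record at write time is what makes the later “what went wrong, when, and who was involved” audit possible without reconstruction.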
Drew
Shifted the automation left
Cathy
Cathy
Crawl before you walk and sprint; be honest and set yourself up for success. If you falsely inflate your organization’s maturity and implement processes that are too advanced, there will be numerous problems and difficulties. If your org is at a very low maturity level, focus on getting a single test running as part of the build process, and continuously improve from there!
Cathy – Tweak
Keep the train moving, build confidence, don’t try to shift culture overnight
Drew
Triage and interpret results
This may seem trivial initially but if you don't do it, it'll slow down your team.
Drew
If you don’t understand your results your quality gates aren’t gates. Your TCM or custom solution should paint this picture for you quickly.
Good? Ship it.
Bad? Get feedback quickly to development and iterate again.
Build-over-build – Is my strategy providing value?
Delivery pipeline is your foundation of orchestrating your manual and automated tests in a streamlined fashion.
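The go/no-go call and the build-over-build question (“is my strategy providing value?”) can be sketched as two small checks over per-build pass rates; the 95% threshold and the numbers are illustrative:

```python
# Go/no-go from the latest unified results, plus a build-over-build trend.
def go_no_go(pass_rate, threshold=0.95):
    return "GO" if pass_rate >= threshold else "NO-GO: feedback to development"

def trend(history):
    """Compare the two most recent builds' pass rates."""
    if len(history) < 2:
        return "not enough data"
    prev, curr = history[-2], history[-1]
    return "improving" if curr > prev else "regressing or flat"

history = [0.91, 0.97]  # pass rate per build
print(go_no_go(history[-1]), "|", trend(history))  # GO | improving
```

Real gates would weigh severity and coverage, not just a raw pass rate, but even this much turns “go/no-go” from a gut call into something the quality gate owner can read off a dashboard.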
Drew
Once you've defined your process and done some iterations, it's really important to continuously look at your process and improve over time.
The process never ends - that is what will get you to the highest level of the functional testing maturity model.