The document provides guidance on establishing an effective testing strategy that combines manual and automated testing. It argues that manual testing consumes time and resources, drags down productivity, and erodes confidence, while automated testing, when implemented properly, improves test maintenance, feedback speed, and regression coverage. The key steps outlined are gaining stakeholder buy-in, building quality into the process, focusing initial automation on builds and unit tests, adding integration and acceptance testing, and establishing visibility and accountability measures.
Test Like A Badger
1. TEST LIKE A BADGER
by: Mike Badger
mike@devBadger.com
801-231-4692
2. MANUAL TESTING SUCKS
• Sucks time and resources
• Sucks development productivity
• Sucks product confidence out of the customers and stakeholders
[Test pyramid diagram: Manual Testing spanning the UI, System, Integration, and Unit levels]
3. AUTOMATED TESTING ROCKS (WHEN DONE RIGHT)
• Majority of test maintenance is done in tandem with development
• Feedback is reliable, continuous, and same or next day
• Regression is built in at every level of testing and is reliably reported
• Ability to test early and often means bugs are found and fixed as they are created
[Test pyramid diagram: automated Unit, Integration, System, and UI layers, with manual exploratory testing on top]
4. WHERE DO WE GET STARTED?
Visibility, Accountability, Then Automation
5. FROM ASSUMPTIONS TO MEASURABLES
Visibility
Where are we?
Where do we want to be?
How do we get there?
How will we measure our progress?
Establish:
• How to create and execute test plans
• How to report and track bugs & suggestions
• Prioritization process
• How to measure progress in turning manual tests into automated tests
• How to make progress visible to stakeholders
6. 1 – BUY-IN FROM STAKEHOLDERS
Are the quality and reliability of our software being sold as a feature?
Development and QA must have a symbiotic relationship
Create natural consequences for breaking the build (you break it, you fix it)
Visually report quality metrics on a centrally located monitor/TV (X Days Since Last Incident, Kanban, Scrum, Burndown…)
Celebrate successes in quality across the company
Visibility Accountability
8. GETTING BEST ROI FROM AUTOMATION
It is important to build a testing solution that is:
• Worth the investment
• Scalable with the growing product
• Automated in its DNA
• Not trading headaches (brittle software -> brittle tests)
[Test pyramid diagram: cost of execution rises and number of tests falls moving up from Unit through Integration, System, and UI to Manual]
Visibility Accountability Automation
9. 1 - AUTOMATE BUILD PROCESS
• Create lightweight, portable, self-sufficient Docker containers
• Ansible orchestrates which Docker containers to build to set up the necessary environment
• Build trigger is set up to respond to SVN / GitHub commits and calls Ansible to initiate the build
Visibility Accountability Automation
12. 2 - REVIEW UNIT TESTS
• Important to distinguish unit tests from integration tests.
• TDD mindset essential. Code is written as a result of tests; tests are not written as a result of code.
Java Backend
• JUnit, TestNG, Spock, Groovy…
• Mockito, EasyMock, JMock…
AngularJS Frontend
• Grunt, Jasmine, Mocha, QUnit…
Visibility Accountability Automation
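To make the unit-testing idea above concrete, here is a minimal sketch of a unit test that isolates its subject behind a stubbed collaborator. The deck names JUnit and Mockito for this job; to stay self-contained this sketch uses only plain Java, and the `PriceCalculator` / `TaxRateService` names are hypothetical examples, not from the deck.

```java
// Hypothetical unit under test: PriceCalculator depends on a TaxRateService.
// In real projects the slides' JUnit + Mockito would supply the assertion and
// the mock; here a lambda stands in as a hand-rolled stub so the file runs alone.
interface TaxRateService {                 // collaborator to be stubbed
    double rateFor(String region);
}

class PriceCalculator {                    // unit under test
    private final TaxRateService taxes;
    PriceCalculator(TaxRateService taxes) { this.taxes = taxes; }
    double grossPrice(double net, String region) {
        return net * (1 + taxes.rateFor(region));
    }
}

public class PriceCalculatorTest {
    public static void main(String[] args) {
        // Stub the dependency instead of calling a real tax service:
        TaxRateService fakeTaxes = region -> 0.10;   // always 10%
        PriceCalculator calc = new PriceCalculator(fakeTaxes);

        double gross = calc.grossPrice(100.0, "UT");
        if (Math.abs(gross - 110.0) > 1e-9)
            throw new AssertionError("expected 110.0, got " + gross);
        System.out.println("PriceCalculatorTest passed");
    }
}
```

Because the only real code exercised is `PriceCalculator`, a failure points directly at that class, which is what distinguishes this from the integration tests on the next slide.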
13. 3 - INTEGRATION TESTING
Run as part of the build process, separate from unit tests
Java Backend
• JBehave for BDD
• JUnit, TestNG, Spock, Groovy for TDD
• Selenium, PhantomJS, Cucumber for browser testing
• “Real test” frameworks like Arquillian
AngularJS Frontend
• Karma, CasperJS…
Visibility Accountability Automation
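To contrast with a unit test, an integration test wires real objects together and checks how they behave as a pair. The slide's frameworks (JBehave, Arquillian, Selenium) are replaced here by a plain `main()`, and the `UserStore` / `SignupService` classes are illustrative stand-ins, not from the deck.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical integration check: both objects are real -- an in-memory
// UserStore and the SignupService that writes to it -- so the test exercises
// how they work *together*, not either one in isolation.
class UserStore {
    private final Map<String, String> byEmail = new HashMap<>();
    void save(String email, String name) { byEmail.put(email, name); }
    String find(String email) { return byEmail.get(email); }
}

class SignupService {
    private final UserStore store;
    SignupService(UserStore store) { this.store = store; }
    boolean signup(String email, String name) {
        if (store.find(email) != null) return false;  // reject duplicate email
        store.save(email, name);
        return true;
    }
}

public class SignupIntegrationTest {
    public static void main(String[] args) {
        UserStore store = new UserStore();            // real store, not a mock
        SignupService service = new SignupService(store);

        if (!service.signup("a@x.com", "Ann"))
            throw new AssertionError("first signup should succeed");
        if (service.signup("a@x.com", "Bob"))
            throw new AssertionError("duplicate signup should fail");
        if (!"Ann".equals(store.find("a@x.com")))
            throw new AssertionError("store should hold Ann");
        System.out.println("SignupIntegrationTest passed");
    }
}
```

A failure here tells you the collaboration is broken, which is why the deck runs these separately from unit tests: they are slower and their failures are harder to localize.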
14. 4 - ACCEPTANCE TESTING (MANUAL)
External Stakeholders:
• Customer: Is it intuitive? Is it easy to use?
• Partners: Can I integrate with it?
• Regulatory: Does it meet industry standards? Is it legal?
Internal Stakeholders:
• Sales: Can I sell it?
• Support: Can I support it?
• Training: Can I train the customer on it?
Visibility Accountability Automation
15. 5 - SYSTEM TESTING (NEXT LEVEL)
Functional: Does the software work?
• End-to-end tests not easily covered with integration tests.
• Based on user stories.
• To automate or not to automate? Depends.
Performance: Does the software work well?
• Test speed (how fast)
• Test load (how many)
• Test stability (how long)
Visibility Accountability Automation
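The slide's three performance questions (how fast, how many, how long) can be sketched as a tiny probe. The workload here (sorting a shuffled list), the thread count, and the 2-second budget are illustrative assumptions, not numbers from the deck.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical performance probe: time one call (speed), then the same call
// from several concurrent threads (load). A stability check would loop this
// for hours; it is omitted here to keep the sketch short.
public class PerformanceProbe {
    static void workload() {                       // stand-in for a real operation
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 50_000; i++) data.add(i);
        Collections.shuffle(data);
        Collections.sort(data);
    }

    public static void main(String[] args) throws InterruptedException {
        // Speed: time a single call.
        long start = System.nanoTime();
        workload();
        long singleMs = (System.nanoTime() - start) / 1_000_000;

        // Load: run the same call from 8 threads at once.
        Thread[] callers = new Thread[8];
        start = System.nanoTime();
        for (int i = 0; i < callers.length; i++) {
            callers[i] = new Thread(PerformanceProbe::workload);
            callers[i].start();
        }
        for (Thread t : callers) t.join();
        long loadMs = (System.nanoTime() - start) / 1_000_000;

        if (loadMs > 2000)                         // illustrative budget
            throw new AssertionError("load run exceeded 2s budget: " + loadMs + "ms");
        System.out.println("single=" + singleMs + "ms load(8 threads)=" + loadMs
                + "ms PerformanceProbe passed");
    }
}
```

In practice dedicated tools carry this out at scale; the point of the sketch is only that each question maps to a measurable assertion that automation can enforce on every build.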
Editor's Notes
Visibility
• Understand the system and how it is supposed to work
• Get the desired state
• Every automation script written will require stakeholder interviews, document trawling, and manual investigation
• Jumping into automation without understanding the current state is suicide
Accountability
• How will we gauge the success of automation efforts?
• How will we maintain their quality over time?
• Must be able to validate the state of configuration at any point in time.
Visibility of current state
• What have we been testing?
• How have we been testing it?
• What documentation exists?
• Where are our biggest risks?
• How do we report, fix, and re-test bugs?
Automation in our DevOps DNA
• Work with the development team to solidify the automated build process
• Tests need to run with the build process
• Tests need to run in applicable containers
• Need to be able to scale test servers for various environments and implementations
• Internal: do objects work with each other?
• External: do objects work correctly with services we rely on that are outside of our control?
• When objects we interface with fail, have we provided a way to fail gracefully?
• Run separately from unit test builds.
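The last note, failing gracefully when an external dependency breaks, is itself testable. A minimal sketch, assuming a hypothetical `ExchangeRateApi` dependency and an illustrative fallback rate (neither is from the deck):

```java
// Hypothetical external dependency outside our control.
interface ExchangeRateApi {
    double usdToEur();
}

// Caller that degrades gracefully instead of propagating the failure.
class Pricing {
    private final ExchangeRateApi api;
    Pricing(ExchangeRateApi api) { this.api = api; }
    double priceInEur(double usd) {
        try {
            return usd * api.usdToEur();
        } catch (RuntimeException e) {
            return usd * 0.90;   // illustrative cached/fallback rate when the API is down
        }
    }
}

public class GracefulFailureTest {
    public static void main(String[] args) {
        // Simulate the external service being down:
        ExchangeRateApi broken = () -> { throw new RuntimeException("service down"); };
        double price = new Pricing(broken).priceInEur(100.0);

        if (Math.abs(price - 90.0) > 1e-9)
            throw new AssertionError("expected fallback 90.0, got " + price);
        System.out.println("GracefulFailureTest passed");
    }
}
```

This is the "external" flavor of integration testing from the notes: the test does not need the real service at all, only a deliberately broken stand-in, which makes the failure path cheap to exercise on every build.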