Why Automate?
Your application will grow; you will not have enough hands
You are blocked by development
Hidden factory costs of bug-fix cycle
Why Shift-Left?
Involve more people early to remove the need for mass inspection
Define measurable success early, work on good parts.
Reduce occurrence of defects
What has this got to do with ways of working?
Unlock capacity
Make people smile
Is not
a Department
extra cost
a final oversight or a massive inspection
someone else’s job
Is
Everyone’s responsibility
Built into the ways of working
Everyone’s job
2. About Me
Max Barrass
0410 533 731
max.barrass@anchora.com.au
Adobe
Technology
Leadership
● Practicing Architect - architecture and development
○ https://www.linkedin.com/in/maxbarrass/
○ Certified AEM Dev, DevOps and Architect.
○ 16 AEM projects, AU/HK/NZ
● Blogs
○ https://aem.design/
○ https://maxbarrass.com/
● Code
○ https://github.com/wildone
○ https://github.com/paulrohrbeck/aem-links
● This Presentation
○ More info in Workshop 1
○ Try it out Source
○ Reviewers: Igor Arkhipov
6. Terminology Recap
● Ways of working - a social process with the aim of achieving value
● Automation Testing - the automation of an otherwise manual process
● Shift-Left - the process of moving quality assurance activities earlier in the ways of working
● Good Automated Feedback Practices - good ones that don’t make you cry…
● AEM Platform - AEM is part of an ecosystem with interlinked features…
7. Terminology Questions!
● Why Automate?
○ Your application will grow; you will not have enough hands
○ You are blocked by development
○ Hidden factory costs of bug-fix cycle
● Why Shift-Left?
○ Involve more people early to remove the need for mass inspection
○ Define measurable success early, work on good parts.
○ Reduce occurrence of defects
● What has this got to do with ways of working?
○ Unlock capacity
○ Make people smile
9. Engineering Testing
● W. Edwards Deming
○ Engineer testing activities into ways of working.
■ Cooperation and continual improvement for individual and organisation.
■ All mistakes are opportunities.
● Joseph Juran
○ Trilogy: Quality Planning (Quality by Design), Quality Control, Quality Improvement
○ Pareto Principle (80/20)
● Armand Feigenbaum
○ Hidden Factory - focus on the good pieces; the cost of bad work stays hidden in the day-to-day
● Genichi Taguchi
○ Shift-Left - Push quality and reliability to planning stage to avoid defects
Read: https://www.juran.com/blog/the-history-of-quality/
11. Quality Principles (Deming’s 14 Points)
1. Create Constancy of Purpose
2. Adopt the New Philosophy
3. Cease Dependence on Mass Inspection
4. End the Practice of Awarding Business on Price Tag Alone
5. Improve Constantly and Forever the System of Production and Service
6. Institute Training and Retraining
7. Institute Leadership
8. Drive Out Fear
9. Break Down Barriers Between Staff Areas
10. Eliminate Slogans, Exhortations, and Targets for the Workforce
11. Eliminate Numerical Goals
12. Remove Barriers to Pride of Workmanship
13. Institute a Vigorous Program of Education and Self-Improvement
14. Take Action to Accomplish the Transformation
12. Quality is
● Is not
○ a Department
○ extra cost
○ a final oversight or a massive inspection
○ someone else’s job
● Is
○ Everyone’s responsibility
○ Built into the ways of working
○ Everyone’s job
14. Life without Testing
● A feature (A) is translated into Requirements (RA)
● Requirements (RA) get translated into Code (B)
● Code (B) is released
15. Life with Testing
● A feature (A) is translated into Requirements (RA)
● Requirements (RA) get translated into Code (B)
● Test if Requirements (RA) equal Code (B)
● Code (B) is released
20. Shift-Left Optimisation
Flow: Agree on requirements (sprint planning) → Agree on acceptance tests (3 amigos) → Incorporate tests into the dev process (automation, dev/QA collaboration) → Explore what can be improved further (exploratory testing)
● How do we establish ways of working that allow shifting the automation testing effort left?
○ Agree on requirements (sprint planning, structured)
○ Agree on acceptance tests (before work, 3 amigos, PO, Dev and Test)
○ Incorporate tests into dev process (after work, 3 amigos, automation review)
○ Explore what can be improved further (exploratory testing, requirements, acceptance tests)
21. Shift-Left Impact
● A feature (A) is translated into Requirements (RA)
○ Test if Requirements (RA) can become Code (B)
● Requirements (RA) get translated into Code (B)
○ Test if Requirements (RA) equal Code (B)
● Code (B) is released
○ Test if Requirements (RA) are met by Code (B)
22. How to do this with AEM?
● Agree on a Design Language System (DLS) that everyone speaks (not just designers)
● Agree on an AEM Component Catalog that everyone talks about and that the DLS is built for
● Agree on Requirements Convention
○ Create a blueprint/template to document each component and unique experience (page)
■ Component, Acceptance Requirements with Tests, Authoring and End-User Design
○ Use this for 3 amigos, before and after work, and improve it with each iteration
● Get your CI/CD in place up front: code commit = full test run
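As a minimal sketch, that “code commit = full test run” idea could look like the following GitHub Actions workflow, reusing the docker-compose services from the “You can follow at home” slide (the file path, job and step names are assumptions, not the project’s real pipeline):

```yaml
# .github/workflows/tests.yml — hypothetical sketch, not the project's actual workflow
name: full-test-run
on: [push]                        # every code commit triggers the full run
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Start the stack
        run: docker-compose up -d
      - name: Wait for AEM to be ready
        run: docker-compose up testingprep
      - name: Deploy the showcase
        run: docker-compose up authordeploy
      - name: Run the test suite
        run: docker-compose up testing
      - name: Check the results
        run: docker-compose up testingcheck
```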
23. How to do this with AEM?
● Leverage a testing framework that combines requirements and code.
○ Enables developers to use this to seed feature tests
○ Enables everyone to participate in testing definition
○ Look at hybrid spec+code tests to avoid tight coupling between tests and implementation.
● Establish feature showcase in AEM
○ What to showcase
■ One page/section per component
■ Unique Experiences (Pages)
■ Content Structures
○ Use this for Automation testing
■ Automation test results are owned by the PO as proof that the product meets requirements.
○ Each feature update means updating the showcase and the automation test specs
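A hedged sketch of what such a hybrid spec+code test could look like with Geb and Spock — the class name, showcase URL and CSS selector below are hypothetical placeholders, not taken from the project:

```groovy
import geb.spock.GebSpec

// Hypothetical spec — the page URL and selector are placeholders,
// not from the aem-design showcase.
class TitleComponentSpec extends GebSpec {

    def "Title component renders the authored text"() {
        when: "the component showcase page is opened"
        go "http://publish.localhost/showcase/title.html"

        then: "the rendered heading matches the acceptance requirement"
        $("h1.cmp-title__text").text() == "Page Title"
    }
}
```

Because the method name and the given/when/then labels read as a requirement while the body is code, PO, Dev and Test can all review the spec in 3 amigos without owning the implementation details.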
24. What will the process look like?
Flow: Requirements → (3 amigos) → Code → (3 amigos) → Automation Tests → Release
31. What and how much should we test?
● Author
○ Test dialogs; as a baseline, can you open the dialog?
○ When you have custom dialog components or logic, test that specifically.
● Publish
○ Test your experience interaction
● Screenshot
○ Test all viewports/breakpoints (~6)
Don’t test AEM’s built-in functionality unless you have modified it.
Test as much as you need to give yourself confidence.
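One possible shape for the screenshot step, assuming Geb reporting is configured — the breakpoint widths and showcase URL below are assumptions; match them to your DLS:

```groovy
import geb.spock.GebSpec
import org.openqa.selenium.Dimension

// Hypothetical spec — breakpoint widths and the showcase URL are placeholders.
class ShowcaseScreenshotSpec extends GebSpec {

    // ~6 viewports, per the slide; align these with your DLS breakpoints
    static final List<Integer> WIDTHS = [320, 480, 768, 1024, 1280, 1920]

    def "capture the showcase at each breakpoint"() {
        expect:
        WIDTHS.every { width ->
            driver.manage().window().setSize(new Dimension(width, 1080))
            go "http://publish.localhost/showcase.html"
            report "showcase-${width}"   // writes a screenshot via Geb reporting
            true
        }
    }
}
```

The captured images can then be diffed against known-good baselines with ImageMagick, per the tooling slide.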
33. Automation Testing Tooling
● Docker and Docker-Compose
○ everyone can use the same artifact (macOS, Windows, Linux)
● Gebish - https://gebish.org/
○ Allows us to run Groovy and drive Selenium like a boss
● Spock Framework - https://spockframework.org/
○ Allows us to write BDD specs together with code and create cool reports
● Maven
○ for AEM content and running your tests
● ImageMagick
○ for comparing images
● AsciiDoctor
○ for making awesome reports in HTML or PDF or both
● Github Actions
○ Running all of this in the ether
36. Docker Compose Orchestration
● Traefik - the terrific proxy that makes all of this possible
● Web - simple page to help everyone, can have your docs etc
● Author - your AEM instance, author and publish (author.localhost)
● Publish - testing site from dispatcher perspective (publish.localhost)
● Dispatcher - test your dispatcher rules (dispatcher.localhost)
● Testing - container for executing your tests
● TestingPrep - an Ansible playbook that waits for AEM and can update the AEM instance
● Selenium Hub - manages a farm of browser nodes
● Selenium Node - an individual browser instance
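A trimmed, hypothetical sketch of how those services could hang together in docker-compose.yml — image names, labels and ports here are placeholders, the real file lives in the repo:

```yaml
# Hypothetical, trimmed docker-compose sketch — not the project's actual file
version: "3"
services:
  traefik:                        # reverse proxy routing the *.localhost hosts
    image: traefik
    ports:
      - "80:80"
  author:                         # AEM author instance → author.localhost
    image: your-registry/aem-author     # placeholder image name
    labels:
      - "traefik.http.routers.author.rule=Host(`author.localhost`)"
  selenium-hub:                   # the grid that fans tests out to browser nodes
    image: selenium/hub
  chrome:                         # an individual browser instance
    image: selenium/node-chrome
    environment:
      - SE_EVENT_BUS_HOST=selenium-hub
  testing:                        # container that executes the Geb/Spock suite
    image: maven                  # placeholder — runs the test build against the hub
    depends_on:
      - selenium-hub
      - author
```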
37. What if adoption is not amazing?
Well, you can still do the tests yourself :D
1. Fork the repo
2. Add your dependencies
3. Make your showcase (import/export from AEM)
4. Watch the pipelines (github)
5. Impress people
39. You can follow at home
● If you have Docker, clone https://github.com/aem-design/geb-aem-testing
● docker-compose up -d
○ Start the stack
● docker-compose up testingprep
○ Ensure author is ready
● docker-compose up authordeploy
○ Deploy our showcase
● docker-compose up testingprep
○ Ensure author is ready
● docker-compose up testing
○ Run the test suite
● docker-compose up testingcheck
○ Check the result