Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to Me - Craig Sullivan
An updated deck of a short talk (30m) given at the first Brighton CRO meetup. Contains useful AB testing tools as well as full speaker notes for most of the slides.
#Measurecamp : 18 Simple Ways to F*** up Your AB Testing - Craig Sullivan
An expanded deck of the top 18 blockers to getting successful AB or Multivariate test results. In this deck, you get a complete checklist of the stuff you need to prepare, watch, launch and monitor your testing, so it gets you the *right* conclusions.
Cross Device Optimisation - Google Analytics Shortcuts - Craig Sullivan
In this session, we explain how to mine GA for broken device experiences, flows, funnel blocks and more... Using a new grid tool we've developed, you can pull multi-dimensional segmented funnel and metric data from Google Analytics - we explain how it works, why you need it and what problems it solves. Find where your site is leaking money through data
Slides to go with a talk on rapid, lightweight research you can do before tackling a landing page, funnel step or lead-gen form. Comes complete with all the Google Analytics reports you'll need to mine useful data to share! Less bullshit, more truth in meetings!
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015 - Craig Sullivan
My Slides from Reaktor Breakpoint 2015 - This is by far the best deck (and hopefully talk) I've done this year. Masses of info, reading, articles, useful reports and more.
Web Analytics Wednesday - Session Replay Tools are Vital - Craig Sullivan
Session Replay or Screen Recording tools are now part of an arsenal of discovery toolkits that can drive optimisation, bug fixes, funnel and journey analysis - using qual and quant techniques. Without these tools, the analytics data misses emotion, frustration, friction and more - I've collated the best tips, tricks, tools and approaches to yield the most valuable insights for CRO / Growth Hacking.
Myths and Illusions of Cross Device Testing - Elite Camp June 2015 - Craig Sullivan
A compendium of the most common mistakes and problems people encounter when trying to optimise or split test cross device experiences (mobile, tablet, desktop, app, tv etc.)
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014 - Craig Sullivan
A practical toolkit for getting inside customers' heads, in order to design and create persuasive psychological approaches to copy, pages, buttons, designs and your entire service. Craig shows you how to mine what you already have - to design a better bank balance and a continuously improving future for your company, staff and your customers.
Surviving the hype cycle - Shortcuts to split testing success - Craig Sullivan
In this talk, I show the key shortcuts to stop doing stupid testing and move towards innovative and transformative design & build methodologies, including innovation through split testing exploration
Myths, Lies and Illusions of AB and Split Testing - Craig Sullivan
What are the common assumptions about AB (split) testing that are wrong? What are the lies told by vendors, consultants and the stuff you have convinced yourself about. What is illusory - what can you trust - what's it really all about. 20 top myths debunked after asking fellow CRO professionals what is on THEIR top list.
12 Things to do Before Your Company Dies : Conversion Conference London - Oct... - Craig Sullivan
A roundup of all the things to help you maintain a competitive edge in experience design and conversion optimisation. With examples of companies putting this stuff together, the tools they are using and their project management approaches, this presentation delves deeper into the cultural aspects of CRO.
Product design is Poo - And how to fix it! - Craig Sullivan
A look at why product design is still so poor, even after 22 years of digital design work. Why do these problems exist and how can we remove them from the way we build products? Lean corporate and startup growth models are explored in the solutions to this horrendous problem!
Condensed testing syrup - @OptimiseorDie @sydney sep 2011 - 4 years of testin... - Craig Sullivan
A summary of my 4 years of A/B and Split testing, with case studies of work, photography guidelines, and key advice on which elements of the page to test for quick wins.
I enjoyed giving this talk at the Online Retailer Conference in Sydney, which is a fine place to visit.
The presentation covers a really good 'pizza' analogy for explaining testing to senior management and budget holders, then moves on to how you go about discovering good places to test on your site, and which tools will help you get that data.
Lastly, I explore what worked for me in testing, show some examples of how similar our winners are across the globe and then cover some cross channel testing. The last one here is a big growth area and involves optimising contact centres and channels, using the web as a tool. Some interesting work going on here and I show some of ours, as well as new things in the pipeline.
There are some great resources attached, including a list of remote user testing services and the best 'guides' I could find on 'Conversion Rate Optimisation'. Hope you enjoyed the talk and thank you Sydney.
Mobile presentation - Sydney Online Retailer - 26 Sep 2011 - Craig Sullivan
In this presentation, I use analytics data from our global mobile reach, to illustrate the trends that are driving growth, how to take opportunity from them and what to do with your own site. I present a case for device and user knowledge, to allow you to optimise conversion rates, revenue and delight for visitors.
Product Design is Poo - And we're all going to die - Craig Sullivan
A humorous presentation about what is wrong with the current way of building digital products. What is wrong, the signs to watch for, and a checklist for reforming your company are laid out with links, resources and further reading.
The top reasons and solutions for not getting value out of your AB tests - some practical tips for designing insightful and correctly instrumented tests.
20 Ways to Shaft your Split Testing : Conversion Conference - Craig Sullivan
This talk is the latest deck showing common problems that will easily break or skew your ab and multivariate testing results. Avoid these problems by following the simple advice in this deck!
Onboard like a juggernaut - Elite Camp 2015 - Conversionista
Conversionista's presentation at Digital Elite Camp in Estonia, 2015.
Find the critical conversion points in your SaaS user onboarding journey.
- Get Registered Prospects to use the service
- Get Active users to pay up
- Get Paying Customers to stay
- Get churned users to return
- Get all of them to refer more users
Do this - And rock!
Why Does My Conversion Rate Suck? Craig Sullivan, Senior Optimisation Consult... - PRWD
Craig Sullivan, Senior Optimisation Consultant, covers the top 10 reasons why your conversion rate might suck. Packed with actionable tips and resources, this presentation is for anyone wanting to improve their conversion optimisation. Craig covers common problems and topic areas such as issues with Google Analytics setup, inputs, tools, testing, testing cycles, product cycles, photo UX, how to analyse statistics and data, segmentation, and multi-channel optimisation. The resource pack also includes a maturity model, crowd-sourced UX, collaborative tools, testing tools for CRO & QA, a Belron methodology example, and CRO and testing resources.
SAMPLE SIZE – The indispensable A/B test calculation that you’re not making - Zack Notes
If you’re a marketer, it’s very likely that you’ve run an A/B test. It’s also likely that you’ve never calculated the sample size for your tests, and instead you run tests until they reach statistical significance. If so, your strategy is statistically flawed. Conforming to sample size requirements means marketers must wait longer for test results, but ignoring them will yield false positives and lead to bad decisions.
This deck was created for an email audience, but there are valuable lessons for anyone who runs A/B tests.
AB Testing and UX - a love story with numbers and people (by Craig Sullivan a... - Northern User Experience
AB Testing and UX - a love story with numbers and people
Slides from the NUX6 talk by Craig Sullivan, Friday 27th October 2017.
2017.nuxconf.uk / nuxuk.org
Synopsis:
What’s wrong with the web these days? The mobile experience sucks. The customer experience sucks. It doesn’t work. It’s too hard to use. The text is too small. Nobody measures this happening. The interaction patterns suck. Nobody ever calls up to complain but nobody does anything anyway. Millions of people lose countless days to friction, poor design and frustrating moments on their devices.
There may be thousands of things you can fix that look promising – but how do you know where to start? What if you could measure what sucked, where it sucked and how big the problem was? Using lightweight research methods and tools, you can stop making excuses and start knowing exactly what to do. Life becomes much simpler and easier with a scientific method of optimising growth or delight within your product.
Craig has trained over 500 people on how to measure and optimise their product experience, finding 100M of ‘lost revenue’ using just one of the techniques you will learn. With reports, checklists, downloadable templates and toolkits for every budget and stage of growth – you can stop guessing tomorrow.
Conversion Optimization: The World Beyond Headlines & Button Color - Optimizely
Patrick McKenzie, Software Developer at Kalzumeus Software
You've run A/B tests.
You're long past the point where "Guess Which Alternative Won" articles teach you something useful.
- How do you get to the next level of conversion optimization?
- How do you get it to be a repeatable team effort?
- How do you retain organizational know-how about previous tests?
- How do you track conversions over more involved funnels?
- How do you keep the team onboard with testing for years at a time?
- How do you manage the engineering aspects of changes which are more in-depth than changing the marketing site?
This session delves into the "Here There Be Dragons" parts of the testing map. Patrick McKenzie, Software Developer at Kalzumeus Software and Bingo Card Creator, has helped a dozen companies through testing maps, and he presents some of the mistakes he's made so that you don't have to repeat them—and also a success or two.
Patrick McKenzie Opticon 2014: Advanced A/B Testing - Patrick McKenzie
A/B Testing Beyond Headlines and Button Colors -- ideas for tests (particularly for B2B SaaS), common pitfalls in organizations, and how to overcome them.
One of the most commonly asked questions is “when is an MVT experiment or AB test finished?”
Is it at 30 days...? 100 conversions...? 10,000 visitors...?
The short answer is... it depends.
Fail and Win: Why a Failed Test Isn’t a Bad Thing - Optimizely
Caleb Whitmore, CEO, Analytics Pros
Ryan Lillis, Strategic Optimization Consultant, Optimizely
Here's something you don't expect to hear at a CRO conference: most A/B tests don't produce a variation that's better than what you already have.
If all you're doing is running an A/B test, viewing select metrics, and giving a "thumbs up" or "thumbs down," you won't have a successful optimization program — even if you happen upon a few "winners."
But you don't have to run your optimization program this way.
A/B testing done right allows you to draw winning insights from "losing" tests that have the power to genuinely affect your business.
Caleb Whitmore, Founder and CEO of Analytics Pros, shows that you can achieve a 360-degree view of data that leverages your analytics engine as well as your testing platform to drive deep and genuine insights about the effects of your tests.
You'll learn a holistic approach to testing that goes way beyond "winners" and "losers."
Pitfalls of product marketing and How Business Requirements Can Make Your Pro... - Eliza Dumitrache
The presentation covers the elements of tracking sales and user behavior that are essential for a speedy and successful sales kick-off, profitability and business development, and the importance of involving the marketing department in product development.
Featuring speakers from Optimizely
Ash Alhashim, Global Director, Sales and Market Development, Optimizely
Amanda Swan, Community Manager, Optimizely
Adam Avramescu, Head of Customer Education, Optimizely
Ever wondered how Optimizely employees use Optimizely? In this session, you’ll learn about our optimization programs, and how we use Optimizely beyond traditional A/B testing. We’ll also uncover how to optimize experiences that don’t have a traditional “conversion” goal. Our Global Director of Sales and Market Development, Ash Alhashim, will walk through an advanced experiment that takes advantage of Optimizely’s most powerful features and how he takes a unique approach to measuring success, perfect for optimizers looking to push their optimization boundaries. Amanda Swan and Adam Avramescu will also share insights from the Optiverse testing program and thoughts on interpreting experiments on properties where “conversion” is unclear.
Allison MacLeod, Sr. Director of Demand Gen at Rapid7 presented "Making Predictive Analytics Work" at the MassTLC sales and marketing conference, March 2016
3 TED style talks of 15-20 minutes, featuring:
(1) Conversion methodologies, Lean UX and Agile? What gives?
(2) #Measurecamp and my Top Analytics Tips of 2013
(3) Conversion tools of the CRO masters
The tools used by the CRO masters around the world to optimise analytics, UX, VOC, insight and testing - all to optimise your insight or conversion figures.
#Measurefest : 20 Simple Ways to Fuck Up your AB tests
1. 20 simple ways to fuck up your AB testing
28th March 2014 @OptimiseOrDie
2. @OptimiseOrDie
• UX and Analytics (1999)
• User Centred Design (2001)
• Agile, Startups, No budget (2003)
• Funnel optimisation (2004)
• Multivariate & A/B (2005)
• Conversion Optimisation (2005)
• Persuasive Copywriting (2006)
• Joined Twitter (2007)
• Lean UX (2008)
• Holistic Optimisation (2009)
Was : Group eBusiness Manager, Belron
Now : Spareroom.co.uk
3. #1 : You’re doing it in the wrong place
4. #1 : You’re doing it in the wrong place
There are 4 areas a CRO expert always looks at:
1. Inbound attrition (medium, source, landing page, keyword, intent and many more…)
2. Key conversion points (product, basket, registration)
3. Processes and steps (forms, logins, registration, checkout)
4. Layers of engagement (search, category, product, add)
How to look at each:
1. Use visitor flow reports for attrition – very useful.
2. For key conversion points, look at loss rates & interactions.
3. Processes and steps – look at funnels or make your own.
4. Layers and engagement – make a ring model.
9. #1 : You’re doing it in the wrong place
• Get to know the flow and loss (leaks) inbound, inside and through key processes or conversion points.
• Once you know the key steps you’re losing people at and how much traffic you have – make a money model.
• Let’s say 1,000 people see the page a month. Of those, 20% (200) convert to checkout.
• Estimate the influence your test can bring. How much money or KPI improvement would a 10% lift in the checkouts deliver? (a minimal worked sketch follows below)
• Congratulations – you’ve now built the world’s first IT plan with a return on investment estimate attached!
• I’ll talk more about prioritising later – but here's a good real-world analogy for you to use:
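A minimal worked sketch of that money model, in Python (the £50 average order value is a hypothetical figure for illustration, not from the deck):

```python
# Money model for the slide's example: 1,000 visitors/month, 20% convert.
visitors_per_month = 1_000
baseline_cvr = 0.20
avg_order_value = 50.0          # hypothetical figure for illustration

baseline_orders = visitors_per_month * baseline_cvr           # 200 per month
extra_orders = baseline_orders * 0.10                         # a 10% lift = 20 per month
extra_revenue_per_year = extra_orders * avg_order_value * 12  # £12,000 per year

print(f"A 10% lift is worth roughly £{extra_revenue_per_year:,.0f} per year")
```

Crude as it is, this is the "return on investment estimate" the slide describes: footfall × conversion × value, before and after the lift you think the test can deliver.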
10. Think like a store owner!
If you can’t refurbish the entire store, which floors or departments will you invest in optimising?
Wherever there is:
• Footfall
• Low return
• Opportunity
11. #2 : Your hypothesis is a piece of crap
Insight – Inputs #FAIL (word cloud): competitor copying, guessing, dice rolling, an article the CEO read, competitor change, panic, ego, opinion, cherished notions, marketing whims, cosmic rays, not ‘on brand’ enough, IT inflexibility, internal company needs, some dumbass consultant, shiny feature blindness, knee-jerk reactions.
12. #2 : These are the inputs you need…
Insight (word cloud): segmentation, surveys, sales and call centre, session replay, social analytics, customer contact, eye tracking, usability testing, forms analytics, search analytics, voice of customer, market research, A/B and MVT testing, big & unstructured data, web analytics, competitor evals, customer services.
13. #2 : Solutions
• You need multiple tool inputs – tool decks are here : www.slideshare.net/sullivac
• Usability testing and user-facing teams – if you’re not doing these properly, you’re hosed
• Session replay tools provide vital input – get vital additional customer evidence
• Simple page analytics don’t cut it – invest in your analytics, especially event tracking
• Ego, opinion, cherished notions – fill these vacuums with insights and data
• Champion the user
14. We believe that doing [A] for people [B] will make outcome [C] happen. We’ll know this when we observe data [D] and obtain feedback [E]. (reverse)
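A hypothetical worked instance of the template (my illustration, not an example from the deck): “We believe that shortening the registration form for people on mobile devices will make more completed sign-ups happen. We’ll know this when we observe a lift in sign-up conversion in analytics and obtain fewer abandonment complaints in session replays and surveys.”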
15. #3 : No analytics integration
Without analytics integration, you’ll struggle with:
• Investigating problems with tests
• Segmentation of results
• Tests that fail, flip or move around
• Tests that don’t make sense
• Broken test setups
• What drives the averages you see?
17. #4 : The test will finish after you die
“These Danish porn sites are so hardcore!” “We’re still waiting for our AB tests to finish!”
• Use a test length calculator like this one:
• visualwebsiteoptimizer.com/ab-split-test-duration/
18. #5 : You don’t test for long enough
• The minimum length
– 2 business cycles (cross check)
– Usually a week, 2 weeks, a month
– Always test ‘whole’ not partial cycles
– Be aware of multiple cycles
– Don’t self-stop!
– PURCHASE CYCLES – KNOW THEM
• How long after that?
– I aim for a minimum 250 outcomes, ideally 350+ for each ‘creative’
– If you test 4 recipes, that’s 1,400 outcomes needed
– You should have worked out how long each batch of 350 needs before you start! (see the sketch below)
– 95% confidence or higher is my aim BUT BIG SECRET -> (p values are unreliable)
– If you segment, you’ll need more data
– It may need a bigger sample if the response rates are similar*
– Use a test length calculator but be aware of the BARE MINIMUM TO EXPECT
– Important insider tip – watch the error bars! The +/- stuff – let’s explain
* Stats geeks know I’m glossing over something here: test time depends on how the two experiments separate in terms of relative performance, as well as how volatile the test response is. I’ll talk about this when I record this one! This is why testing similar stuff sux.
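A minimal sketch of the arithmetic a test length calculator like the one linked above performs (a standard two-proportion sample size formula; the 5% baseline, 10% lift and 500 daily visitors per variant are illustrative assumptions, not figures from the deck):

```python
from scipy.stats import norm

def sample_size_per_variant(p_base, rel_lift, alpha=0.05, power=0.8):
    """Classic two-proportion sample size: visitors needed per variant."""
    p_var = p_base * (1 + rel_lift)
    z_a = norm.ppf(1 - alpha / 2)   # e.g. 1.96 for 95% confidence
    z_b = norm.ppf(power)           # e.g. 0.84 for 80% power
    p_bar = (p_base + p_var) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5 +
           z_b * (p_base * (1 - p_base) + p_var * (1 - p_var)) ** 0.5) ** 2
    return num / (p_var - p_base) ** 2

n = sample_size_per_variant(p_base=0.05, rel_lift=0.10)  # 5% CVR, 10% lift
daily_visitors_per_variant = 500                          # hypothetical traffic
print(f"{n:,.0f} visitors per variant ≈ {n / daily_visitors_per_variant:.0f} days")
```

Note how the `(p_var - p_base) ** 2` denominator makes the slide's point: the more similar the two response rates, the more traffic (and time) you need – which is why testing similar stuff sux.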
19. #5 : You put faith in the confidence value
95%, 99%, 99.99% confidence – what’s that?
• It’s a stats thing
• Seriously, look at this one LAST in your testing
• Purchase cycle, business cycles, sample size, error bar separation – ALL come before this one. Got it?
• Why? It’s to do with p-values. Read these articles:
• http://bit.ly/1gq9dtd
• If you rely on confidence, you are relying upon something that’s unreliable and moves around, particularly early in testing.
• Don’t be fooled by your testing package – watch the error bars instead of confidence.
21. Graph is a range, not a line:
[Chart: the same 9.1% conversion rate shown with different error bars – 9.1 ± 1.9%, 9.1 ± 0.9%, 9.1 ± 0.3% – shrinking as the sample grows. A minimal sketch of the calculation follows below.]
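A minimal sketch of where error bars like those come from (a normal-approximation confidence interval; the sample sizes are illustrative guesses chosen to roughly reproduce the slide’s ± figures):

```python
from scipy.stats import norm

def error_bar(p, n, confidence=0.95):
    """Half-width of the normal-approximation CI for a conversion rate."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    return z * (p * (1 - p) / n) ** 0.5

p = 0.091  # a 9.1% conversion rate, as on the slide
for n in (900, 4_000, 36_000):  # illustrative sample sizes
    print(f"n={n:>6}: {p:.1%} ± {error_bar(p, n):.1%}")
```

Running this prints ±1.9%, ±0.9% and ±0.3% respectively: the rate your testing tool reports is the same, but the fuzzy region around it only narrows as outcomes accumulate.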
22. #5 : Test length summary
• The minimum length:
– 2 business cycles and > purchase cycle as a minimum, regardless of outcomes. Test for less and you’re cutting corners.
– 250+, prefer 350+ outcomes in each
– Error bar separation between creatives
– 95%+ confidence (unreliable)
• Pay attention to:
– The time it will take for the number of ‘recipes’ in the test
– The actual footfall to the test – not sitewide numbers
– Test results that don’t separate – this makes the test longer
– This is why you need brave tests – to drive difference
– The error bars – the numbers in your AB testing tool are not precise – they’re fuzzy regions that depend on response and sample size
– Sudden changes in test performance or response
– Monitor early tests like a chef!
23. #6 : You suffer premature test ejaculation
• Ignore the graphs. Don’t draw conclusions. Don’t dance. Calm down.
• Get a feel for the test but don’t do anything yet!
• Remember – in A/B, 50% of returning visitors will see a new shiny website!
• Until your test has had at least 1 business cycle and 250-350 outcomes, don’t bother even getting excited!
• Watching regularly is good though. You’re looking for anything that looks really odd – your analytics person should be checking all the figures until you’re satisfied.
• All tests move around or show big swings early in the testing cycle. Here is a very high traffic site – it still takes 10 days to start settling. Lower traffic sites will stretch this period further.
24. #7 : No QA testing for the AB test?
25. #7 : QA test or die!
• Over 40% of tests have had QA issues.
• It’s very easy to break or bias the testing.
Browser testing : www.crossbrowsertesting.com, www.browserstack.com, www.spoon.net, www.cloudtesting.com, www.multibrowserviewer.com, www.saucelabs.com
Mobile devices : www.perfectomobile.com, www.deviceanywhere.com, www.mobilexweb.com/emulators, www.opendevicelab.com
26. #8 : Opportunities are not prioritised
Once you have a list of potential test areas, rank them by opportunity vs. effort. The common ranking metrics that I use include these:
• Opportunity (revenue, impact)
• Dev resource
• Time to market
• Risk / complexity
Make yourself a quadrant diagram and plot them! (a minimal scoring sketch follows below)
27. #9 : Your cycles are too slow
[Chart: conversion over 0-18 months]
28. #9 : Solutions
• Give priority boarding for opportunities – the best seats reserved for metric shifters
• Release more often to close the gap – more testing resource helps, analytics ‘hawk eye’
• Kaizen – continuous improvement – others call it JFDI (just f***ing do it)
• Make changes AS WELL as tests, basically! – these small things add up
• RUSH Hair booking – over 100 changes, no functional changes at all – 37% improvement
• In between product lifecycles? – the added lift for 10 days’ work, worth 360k
30. #10 : How do I know when it’s ready?
The hallmarks of a cooked test are:
– It’s done at least 1 or preferably 2+ business cycles and at least one if not two purchase cycles
– You have at least 250-350 outcomes for each recipe
– It’s not moving around hugely at creative or segment level
– The test results are clear – even if the precise values are not
– The intervals are not overlapping (much)
– If a test is still moving around, you need to investigate
– Always declare on a business cycle boundary – not the middle of a period (this introduces bias)
– Don’t declare in the middle of a limited-time advertising campaign (e.g. TV, print, online)
– Always test before and after large marketing campaigns (one week on, one week off)
32. #11 : Your test fails
• Learn from the failure! If you can’t learn from the failure, you’ve designed a crap test.
• Next time you design, imagine all your stuff failing. What would you do? If you don’t know or you’re not sure, get it changed so that a negative becomes insightful.
• So : failure itself at a creative or variable level should tell you something.
• On a failed test, always analyse the segmentation and analytics.
• One or more segments will be over and under.
• Check for varied performance.
• Now add the failure info to your knowledge base.
• Look at it carefully – what does the failure tell you? Which element do you think drove the failure?
• If you know what failed (e.g. making the price bigger) then you have very useful information.
• You turned the handle the wrong way.
• Now brainstorm a new test.
33. #12 : The test is ‘about the same’
• Analyse the segmentation.
• Check the analytics and instrumentation.
• One or more segments may be over and under.
• They may be cancelling out – the average is a lie (see the sketch below).
• The segment level performance will help you (beware of small sample sizes).
• If you genuinely have a test which failed to move any segments, it’s a crap test – be bolder.
• This usually happens when the test isn’t bold or brave enough in shifting away from the original design, particularly on lower traffic sites.
• Get testing again!
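A minimal numeric sketch of segments cancelling out (hypothetical numbers: desktop rises, mobile falls, and the blended average looks flat):

```python
# Per-segment (visitors, conversions) for control A and variant B - made-up data.
segments = {
    "desktop": ((10_000, 1_000), (10_000, 1_150)),  # B up ~15%
    "mobile":  ((10_000,   600), (10_000,   460)),  # B down ~23%
}

tot = {"A": [0, 0], "B": [0, 0]}
for name, ((va, ca), (vb, cb)) in segments.items():
    print(f"{name}: A {ca/va:.1%} vs B {cb/vb:.1%}")
    tot["A"][0] += va; tot["A"][1] += ca
    tot["B"][0] += vb; tot["B"][1] += cb

# The blended averages look 'about the same' even though both segments moved.
print(f"overall: A {tot['A'][1]/tot['A'][0]:.1%} vs B {tot['B'][1]/tot['B'][0]:.1%}")
```

This prints 10.0% vs 11.5% on desktop and 6.0% vs 4.6% on mobile, yet 8.0% vs 8.1% overall – exactly the lying average the slide warns about.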
34. #13 : The test keeps moving around
• There are three reasons it is moving around:
– Your sample size (outcomes) is still too small
– The external traffic mix, customers or reaction has suddenly changed, or
– Your inbound marketing-driven traffic mix is completely volatile (very rare)
• Check the sample size
• Check all your marketing activity
• Check the instrumentation
• If no reason, check segmentation
35. #14 : The test has flipped on me
• Something like this can happen:
• Check your sample size. If it’s still small, then expect this until the test settles.
• If the test does genuinely flip – and quite severely – then something has changed with the traffic mix, the customer base or your advertising. Maybe the PPC budget ran out? Seriously!
• To analyse a flipped test, you’ll need to check your segmented data. This is why you have a split testing package AND an analytics system.
• The segmented data will help you to identify the source of the shift in response to your test. I rarely get a flipped one, and it’s always something.
36. #15 : Should I run an A/A test first?
• No – and this is why:
– It’s a waste of time
– It’s easier to test and monitor instead
– You are eating into test time
– Also applies to A/A/B/B testing
– A/B/A running at 25%/50%/25% is the best (see the sketch below)
• Read my post here : http://bit.ly/WcI9EZ
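A minimal sketch of the 25%/50%/25% A/B/A split mentioned above (deterministic bucketing by hashing the visitor ID; my illustration, not the deck’s or any vendor’s implementation):

```python
import hashlib

def assign(visitor_id: str) -> str:
    """Deterministically bucket a visitor into A1/B/A2 at 25/50/25."""
    h = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 100
    if h < 25:
        return "A1"  # first control slice
    if h < 75:
        return "B"   # the variant
    return "A2"      # second control slice: compare A1 vs A2 to sanity-check

print(assign("visitor-12345"))  # the same visitor always lands in the same bucket
```

Because the two control slices should behave identically, any gap between A1 and A2 flags an instrumentation problem – giving you the sanity check of an A/A test without spending dedicated calendar time on one.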
37. #16 : Nobody feels the test
• You promised a 25% rise in checkouts – you only see 2%
• Traffic, advertising, marketing may have changed
• Check they’re using the same precise metrics
• Run a calibration exercise
• I often leave a 5 or 10% stub running in a test
• This tracks the old creative once the new one goes live
• If conversion is also down for that one, BINGO!
• Remember – the AB test is an estimate – it doesn’t precisely predict future performance
• This is why infrequent testing is bad
• Always be trying a new test instead of basking in the glory of one you ran 6 months ago. You’re only as good as your next test.
38. #17 : You forgot about mobile & tablet
• If you’re AB testing a responsive site, pay attention
• Content will break differently on many screens
• Know thy users and their devices
• Use Bango or Google Analytics to define a test list
• Make sure you test mobile devices & viewports
• What looks good on your desk may not be for the user
• It’s harder to design cross-device tests
• You’ll need to segment mobile, tablet & desktop response in the analytics or AB testing package
• Your personal phone is not a device mix
• Ask me about making your device list
• Buy core devices, rent the rest from deviceanywhere.com
39. #18 : Oh shit – low traffic!
• If volumes are small, contact customers – reach out.
• If the data volumes aren’t there, there are still customers!
• Drive design from levers you can apply – game the system
• Pick clean and simple clusters of change (hypothesis driven)
• Use a goal at an earlier ring stage or funnel step
• Beware of using clickthroughs when attrition is high on the other side
• Try before-and-after testing on identical time periods (measure in an analytics model)
• Be careful about small sample sizes (<100 outcomes)
• Are you working your automated emails?
• Fix JFDI, performance and UX issues too!
40. #18 : Low traffic site tips
• Forget MVT or A/B/N tests – run your numbers
• Test things with high impact – don’t be a wuss!
• Use UX and session replay to aid insight
• Run a task gap survey (4Q style)
• Run a dropped basket survey (LF style)
• Run a general survey + check social + other sites
• Run sitewide tests that appear on all pages or large clusters of pages:
• UVPs (“We are a cool brand”), USPs (“Free returns!”), UCPs (“10% off today”)
• Headers, footers, nudge bars, USP bars, navigation, product pages, delivery info etc.
41. #19 : You chose the wrong kind of test
• A/B testing – good for:
– A single change of content or design layout
– A group of related changes (e.g. payment security)
– Finding a new and radical shift for a template design
– Lower traffic pages or shorter test times
• Multivariate testing – good for:
– Higher traffic pages
– Groups of unrelated changes (e.g. delivery & security)
– Multiple content or design style changes
– Finding specific drivers of test lifts
– Testing multiple versions (e.g. click here, book now, go)
– Where you need to understand strong and weak cross-variable interactions
– Don’t use it to settle arguments or sloppy thinking!
42. #20 : Other flavours of testing
• Micro testing (tiny change) – good for:
– Proving to the boss that testing works
– Demonstrating to IT that it works without impact
– Showing the impact of a seemingly tiny change
– Proof of concept before a larger test
• Funnel testing – good for:
– Checkouts
– Lead gen
– Forms processes
– Quotations
– Any multi-step process with data entry
• Fake it and build it – good for:
– Testing new business ideas
– Trying out promotions on a test sample
– Estimating impact before you build
– Helps you calculate ROI
– You can even split test entire server farms
43. #20 : Other flavours of testing
Example ‘fake it’ message: “Congratulations! Today you’re the lucky winner of our random awards programme. You get all these extra features for free, on us. Enjoy!”
44. Top f***ups for 2014
1. Testing in the wrong place
2. Your hypothesis inputs are crap
3. No analytics integration
4. Your test will finish after you die
5. You don’t test for long enough
6. You peek before it’s ready
7. No QA for your split test
8. Opportunities are not prioritised
9. Testing cycles are too slow
10. You don’t know when tests are ready
11. Your test fails
12. The test is ‘about the same’
13. Test flips behaviour
14. Test keeps moving around
15. You run an A/A test and waste time
16. Nobody ‘feels’ the test
17. You forgot you were responsive
18. You forgot you had no traffic
19. You ran the wrong test type
20. You didn’t try all the flavours of testing
45. Is there a way to fix this then? Conversion heroes!
And here’s a boring slide about me – and where I’ve been driving over 400M of additional revenue in the last few years. In two months this year alone, I’ve found an additional ¾M pounds of annual profit for clients. The sharp-eyed among you will see that Lean UX hasn’t been around since 2008 – many startups and teams were doing this stuff before it got a new name, even if the approach was slightly different. For the last 4 years, I’ve been optimising sites using the combination of techniques I’ll show you today.
Tomorrow - go forth and kick their flabby, low-converting asses.