16. If it isn't working, you're not doing it right
@OptimiseOrDie
17. #1 : Your analytics is cattle trucked
@OptimiseOrDie
19. #1 : Common problems (GA)
• Dual purpose goal page
– One page used by two outcomes – and not split
• Cross domain tracking
– Where you jump between sites, this borks the data
• Filters not correctly set up
– Your office, agencies, developers are skewing data
• Code missing or double code
– Causes visit splitting, double pageviews, skews bounce rate
• Campaign, Social, Email tracking etc.
– External links you generate are not set up to record properly
• Errors not tracked (404, 5xx, Other)
– You are unaware of error volumes, locations and impact
• Dual flow funnels
– Flows join in the middle of a funnel or loop internally
• Event tracking skews bounce rate
– If an event is set to be 'interactive' – it can skew bounce rate (example)
@OptimiseOrDie
20. #1 : Common problems (GA)
– EXAMPLE: two funnels, showing visitors remaining at each step and the % lost from the previous step

             Landing   1st interaction     2nd interaction     3rd interaction     4th interaction
Funnel 1     55900     527 (99.1% loss)    66 (87.5% loss)     55 (16.7% loss)     33 (40.0% loss)
Funnel 2     30900     4120 (86.7% loss)   2470 (40.0% loss)   1680 (32.0% loss)   1240 (26.2% loss)
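The loss figures in this example are simply the step-to-step drop-off. A minimal Python sketch (using the first funnel's counts from the slide purely as illustration) shows how they are derived:

```python
def step_losses(counts):
    """Return the fraction of visitors lost between each pair of consecutive steps."""
    return [1 - after / before for before, after in zip(counts, counts[1:])]

# Counts from the first example funnel: Landing -> 1st -> 2nd -> 3rd -> 4th interaction
funnel = [55900, 527, 66, 55, 33]

for step, loss in enumerate(step_losses(funnel), start=1):
    print(f"Step {step}: {loss:.1%} of visitors lost")
# Prints 99.1%, 87.5%, 16.7%, 40.0% - matching the table above
```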
21. #1 : Solutions
• Get a Health Check for your Analytics
– Try @prwd, @danbarker, @peter_oneill or ask me!
• Invest continually in instrumentation
– Aim for at least 5% of dev time to fix + improve
• Stop shrugging : plug your insight gaps
– Change 'I don't know' to 'I'll find out'
• Look at event tracking (Google Analytics)
– If set up correctly, you get wonderful insights
• Would you use paper instead of a till?
– You wouldn't do it in retail so stop doing it online!
• How do you win F1 races?
– With the wrong performance data, you won‟t
@OptimiseOrDie
22. Insight - Inputs
#FAIL inputs: Competitor copying, Guessing, Dice rolling, An article the CEO read, Competitor change, Panic, Ego, Opinion, Cherished notions, Marketing whims, Cosmic rays, Not 'on brand' enough, IT inflexibility, Internal company needs, Some dumbass consultant, Shiny feature blindness, Knee jerk reactions
#2 : Your inputs are all wrong
@OptimiseOrDie
23. Insight - Inputs
Insight sources: Segmentation, Surveys, Sales and Call Centre, Session Replay, Social analytics, Customer contact, Eye tracking, Usability testing, Forms analytics, Search analytics, Voice of Customer, Market research, A/B and MVT testing, Big & unstructured data, Web analytics, Competitor evals, Customer services
#2 : Your inputs are all wrong
@OptimiseOrDie
24. #2 : Solutions
• Usability testing and User Centred design
– If you're not doing this properly, you're hosed
• Champion UX+ - with added numbers
– (Re)designing without inputs + numbers is guessing
• You need one team on this, not silos
– Stop handing round the baby (I'll come back to this)
• Ego, Opinion, Cherished notions – fill gaps
– Fill these vacuums with insights and data
• Champion the users
– Someone needs to take their side!
• You need multiple tool inputs
– Let me show you my core list
@OptimiseOrDie
25. #2 : Core tools
• Properly set up analytics
– Without this foundation, you're toast
• Session replay tools
– Clicktale, Tealeaf, Sessioncam and more…
• Cheap / Crowdsourced usability testing
– See the resource pack for more details
• Voice of Customer / Feedback tools
– 4Q, Kampyle, Qualaroo, Usabilla and more…
• A/B and Multivariate testing
– Optimizely, Google Content Experiments, VWO
• Email, Browser and Mobile testing
– You don't know if it works unless you check
@OptimiseOrDie
26. #3 : You're not testing (enough)
@OptimiseOrDie
27. #3 : Common problems
• Let’s take a quick poll
– How many tests do you complete a month?
• Not enough resource
– You MUST hire, invest and ringfence time and staff for CRO
• Testing has gone to sleep
– Some vendors have a 'rescue' team for these accounts
• Vanity testing takes hold
– Getting one test done a quarter? Still showing it a year later?
• You keep testing without buy-in at C-Level
– If nobody sees the flower, was it there?
• You haven’t got a process – just a plugin
– Insight, Brainstorm, Wireframe, Design, Build, QA test, Monitor, Analyse. Tools, Process, People, Time -> INVEST
• IT or release barriers slow down work
– Circumvent with tagging tools
– Develop ways around the innovation barrier
@OptimiseOrDie
29. #4 : Not executing fast enough
• Silo Mentality means pass the product
– No 'one team' approach means no 'one product'
• The process is badly designed
– See the resource pack or ask me later!
• People mistake hypotheses for finals
– Endless argument, tweaking means NO TESTING – let the test
decide, please!
• No clarity : authority or decision making
– You need a strong leader to get things decided
• Signoff takes far too long
– Signoff by committee is a velocity killer – the CUSTOMER and
the NUMBERS are the signoff
• You set your target too low
– Aim for a high target and keep increasing it
@OptimiseOrDie
31. #4 : Execution solutions
• Agile, One Team approach
– Everyone works on the lifecycle, together
• Hire Polymaths
– T-shaped or just multi-skilled, I hire them a lot
• Use Collaborative Tools, not meetings
– See the resource pack
• Market the results
– Market this stuff internally like a PR agency
– Encourage betting in the office
• Smash down silos – a special mission
– Involve the worst offenders in the hypothesis team
– “Hold your friends close, and your enemies closer”
– Work WITH the developers to find solutions
– Ask Developers and IT for solutions, not apologies
@OptimiseOrDie
32. #5 : Product cycles are too long
[Chart: conversion plotted over 0-18 months]
@OptimiseOrDie
33. #5 : Solutions
• Give Priority Boarding for opportunities
– The best seats reserved for metric shifters
• Release more often to close the gap
– More testing resource helps, analytics 'hawk eye'
• Kaizen – continuous improvement
– Others call it JFDI (just f***ing do it)
• Make changes AS WELL as tests, basically!
– These small things add up
• RUSH Hair booking – Over 100 changes
– No functional changes at all – 37% improvement
• In between product lifecycles?
– The added lift for 10 days work, worth 360k
@OptimiseOrDie
35. #6 – No Photo UX
24 Jan 2012
• Persuasion / Influence /
Direction / Explanation
• Helps people process
information and stories
• Vital to sell an 'experience'
• Helps people recognise and
discriminate between things
• Supports Scanning Visitors
• Drives emotional response
short.cx/YrBczl
36. • Very powerful and under-estimated area
• I've done over 20M visitor tests with people
images for a service industry – some tips:
• The person, pose, eye gaze, facial
expressions and body language – cause
visceral emotional reactions and big changes
in behaviour
• Eye gaze crucial – to engage you or to 'point'
Photo UX
24 Jan 2012
37. • Negative body language is a turnoff
• Uniforms and branding a positive (ball cap)
• Hands are hard to handle – use a prop to help
• For Ecommerce – tip: test bigger images!
• Autoglass and Belron always use real people
• In most of the 33 countries tested with strong female
and male images, the female image wins
• Smile and authenticity in these examples is
absolutely vital
• So, I have a question for you
Photo UX
@OptimiseOrDie
42. #7 : Your tests are cattle trucked
• Many tests fail due to QA or browser bugs
– Always do cross browser QA testing – see resources
• Don’t rely on developers saying ‘yes’
– Use your analytics to define the list to test
• Cross instrument your analytics
– You need this to check the test software works
• Store the variant(s) seen in analytics
– Compare people who saw A/B/A vs. A/B/B
• Segment your data to find variances
– Failed tests usually show differences for segments
• Watch the test and analytics CLOSELY
– After you go live, religiously check both
– Read this article : stanford.io/15UYov0
@OptimiseOrDie
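One way to act on 'cross instrument your analytics' and 'store the variant seen' is to pull per-variant counts from both the testing tool and your analytics and flag any drift. A minimal, hypothetical sketch (the counts and the 5% threshold are invented, not from the deck):

```python
# Participant and conversion counts per variant, as reported by the A/B testing
# tool and by the analytics custom dimension that stored the variant seen.
test_tool = {"A": {"visitors": 10240, "conversions": 312},
             "B": {"visitors": 10188, "conversions": 355}}
analytics = {"A": {"visitors": 9950,  "conversions": 301},
             "B": {"visitors": 10510, "conversions": 349}}

THRESHOLD = 0.05  # flag anything more than 5% apart

for variant in test_tool:
    for metric in ("visitors", "conversions"):
        a, b = test_tool[variant][metric], analytics[variant][metric]
        drift = abs(a - b) / max(a, b)
        status = "OK" if drift <= THRESHOLD else "CHECK TAGGING / QA"
        print(f"{variant} {metric}: tool={a} analytics={b} drift={drift:.1%} {status}")
```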
43. #8 : Stats are confusing
• Many testers & marketing people struggle
– How long will it take to run the test?
– Is the test ready?
– How long should I keep it running for?
– It says it's ready after 3 days – is it?
– Can we close it now – the numbers look great!
• A/B testing maths for dummies:
– http://bit.ly/15UXLS4
• For more advanced testers:
– Read this : http://bit.ly/1a4iJ1H
• I’m going to build a stats course
– To explain all the common questions
– To save me having to explain this crap all the time
@OptimiseOrDie
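The deck does not spell out the maths, so here is a minimal sketch of the standard two-proportion sample-size estimate behind 'how long will it take to run the test?'; the baseline rate, uplift and traffic figures are placeholders:

```python
from scipy.stats import norm

def sample_size_per_variant(baseline, uplift, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect a relative uplift on a baseline
    conversion rate (two-sided test at the given alpha and power)."""
    p1 = baseline
    p2 = baseline * (1 + uplift)
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2

# Placeholder figures: 3% baseline conversion, hoping to detect a 10% relative uplift
n = sample_size_per_variant(baseline=0.03, uplift=0.10)
daily_visitors = 4000          # visitors entering the test per day (both variants)
days = 2 * n / daily_visitors  # two variants share the traffic
print(f"~{n:,.0f} visitors per variant, roughly {days:.0f} days at current traffic")
```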
44. #9 : You're not segmenting
• Averages lie
– What about new vs. returning visitors?
– What about different keyword groups?
– Landing pages? Routes? Attributes?
• Failed tests are just ‘averaged out’
– You must look at segment level data
– You must integrate the analytics + a/b test software
• The downside?
– You'll need more test data – to segment
• The upside?
– Helps figure out why the test didn't perform
– Finds value in failed or 'no difference' tests
– Drives further testing focus
@OptimiseOrDie
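As a concrete illustration of reading results at segment level, here is a minimal pandas sketch with invented numbers; in practice the variant seen would come from your analytics (e.g. a custom dimension) joined to visit and conversion counts:

```python
import pandas as pd

# Aggregated visitor/conversion counts per segment and variant
# (illustrative numbers; in practice export these from analytics,
# with the variant seen stored against each visit).
data = pd.DataFrame([
    ("new",       "A", 5200, 140), ("new",       "B", 5150, 182),
    ("returning", "A", 3100, 155), ("returning", "B", 3080, 138),
], columns=["segment", "variant", "visitors", "conversions"])

data["cr"] = data["conversions"] / data["visitors"]
pivot = data.pivot(index="segment", columns="variant", values="cr")
pivot["relative_uplift"] = pivot["B"] / pivot["A"] - 1
print(pivot.round(4))
# The blended average can hide this: B may win for new visitors and lose for returning ones.
```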
45. #10 : You're unichannel optimising
• Not using call tracking
– Look at Infinity Tracking (UK)
– Get Google keyword level call volumes!
• You don’t measure channel switchers
– People who bail a funnel and call
– People who use chat or other contact/sales
• You ‘forget’ mobile & tablet journeys
– Walk the path from search -> ppc/seo -> site
– Optimise for all your device mix & journeys
• You’re responsive
– Testing may now bleed across device platforms
– Changing in one place may impact many others
– QA, Device and Browser testing even more vital
@OptimiseOrDie
46. SUMMARY : The best Companies….
• Invest continually in Analytics instrumentation, tools & people
• Use an Agile, iterative, Cross-silo, One team project culture
• Prefer collaborative tools to having lots of meetings
• Prioritise development based on numbers and insight
• Practice real continuous product improvement, not SLED
• Source photos and copy that support persuasion and utility
• Have cross channel, cross device design, testing and QA
• Segment their data for valuable insights, every test or change
• Continually try to reduce cycle (iteration) time in their process
• Blend ‘long’ design, continuous improvement AND split tests
• Make optimisation the engine of change, not the slave of ego
• See the Maturity Model in the resource pack
@OptimiseOrDie
47. So you want examples?
• Belron – Ed Colley
• Dell – Nazli Yuzak
• Shop Direct – Paul Postance (now with EE)
• Expedia – Oliver Paton
• Schuh – Stuart McMillan
• Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann
• Gov.uk – Adam Bailin (now with the BBC)
Read the gov.uk principles : www.gov.uk/designprinciples
And my personal favourite of 2013 – Airbnb!
@OptimiseOrDie
48. • You're in the right place today!
• This work is rarely easy – it always involves doing
LOTS of things, not just one test a quarter
• Invest in people, tools, analytics, techniques but
most of all a process and strategy – cool tools are
not enough
• Stop putting things in the next release (JFDI)
• This is not a bolt-on – it IS the process
• Don't be afraid to fail – you're learning
• Be Brave, Be Bold and most importantly
• Never ever ever EVER give up
• Enjoy the wonderful lineup….
Don't panic!
49. Is there a way to fix this then?
Conversion
Heroes!
@OptimiseOrDie
52. RESOURCE PACK
• Maturity model
• Crowdsourced UX
• Collaborative tools
• Testing tools for CRO & QA
• Belron methodology example
• CRO and testing resources
53. Maturity Model (dimensions: Culture, Mission, Process, Testing focus, Analytics focus, Insight methods)

Level 1 – Starter Level: Ad hoc culture, local heroes, 'chaotic good'; mission: get buy-in; outline process, small team; testing: guessing, basic A/B testing, low hanging fruit; analytics: basic tools, bounce rates, big volume landing pages; insight: analytics, surveys, contact centre, low budget usability.

Level 2 – Early maturity: Dedicated team; mission: prove ROI; testing: volume opportunities, micro testing, + multivariate; analytics: + funnel analysis, low converting & high loss pages, session replay, no segments; insight: + regular usability testing/research, prototyping, session replay, onsite feedback.

Level 3 – Serious testing: Cross-silo team; mission: scale the testing; well developed process; testing: systematic tests; analytics: + funnel optimisation, call tracking, some segments, + offline integration, single channel picture; insight: + User Centred Design, layered feedback, mini product tests.

Level 4 – Core business value: Mission: mine value; streamlined process; testing: + cross channel testing; analytics: + funnel fixes, forms analytics, channel switches, integrated CRO and analytics, segmentation; insight: + customer sat scores tied to UX, rapid iterative testing and design.

Level 5 – You rock, awesomely: Ninja team, testing in the DNA; mission: continual improvement; company wide process; testing and analytics: + spread tool use, dynamic adaptive targeting, machine learning, realtime, multichannel funnels, cross channel synergy; insight: + all channel view of customer, driving offline using online, all promotion driven by testing.
54. Remote UX tools (P=Panel, S=Site recruited, B=Both)
Usertesting (B) www.usertesting.com
Userlytics (B) www.userlytics.com
Userzoom (S) www.userzoom.com
Intuition HQ (S) www.intuitionhq.com
Mechanical turk (S) www.mechanicalturk.com
Loop11 (S) www.loop11.com
Open Hallway (S) www.openhallway.com
What Users Do (P) www.whatusersdo.com
Feedback army (P) www.feedbackarmy.com
User feel (P) www.userfeel.com
Ethnio (For Recruiting) www.ethnio.com
Feedback on Prototypes / Mockups
Pidoco www.pidoco.com
Verify from Zurb www.verifyapp.com
Five second test www.fivesecondtest.com
Conceptshare www.conceptshare.com
Usabilla www.usabilla.com
2 - UX Crowd tools
60. • Lots of people don’t know this
• Serious time is getting wasted on pulling and preparing data
• Use the Google API to roll your own reports straight into Big G
• Google Analytics + API + Google docs integration = A BETTER LIFE!
• Hack your way to having more productive weeks
• Learn how to do this to make completely custom reports
3.5 - Google Docs and Automation
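To make 'roll your own reports' concrete, here is a minimal sketch against the legacy Core Reporting API v3 (era-appropriate for this deck) using a service account; the view ID, key file and query are placeholders, and the rows could equally be pushed into a Google Sheet rather than a CSV:

```python
import csv
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json", scopes=SCOPES)  # placeholder key file
analytics = build("analytics", "v3", credentials=creds)

# Placeholder view/profile ID and query: sessions and transactions by day and device
report = analytics.data().ga().get(
    ids="ga:12345678",
    start_date="30daysAgo",
    end_date="yesterday",
    metrics="ga:sessions,ga:transactions",
    dimensions="ga:date,ga:deviceCategory",
).execute()

headers = [h["name"] for h in report["columnHeaders"]]
with open("weekly_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(headers)
    writer.writerows(report.get("rows", []))
```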
66. 5 - Methodologies: Lean UX
Positive
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas
Negative
– Often needs user test feedback to
steer the development, as data not
enough
– Bosses distrust stuff where the
outcome isn’t known
“The application of UX design methods into product
development, tailored to fit Build-Measure-Learn cycles.”
67. 5 - Agile UX / UCD / Collaborative Design
Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when
using Agile iterations)
Negative
– Without quant data, user goals can
drive the show – missing the business
sweet spot
– Some people find it hard to integrate
with siloed teams
– Doesn't work with waterfall IMHO
(Cycle: Research -> Concept -> Wireframe -> Prototype -> Test -> Analyse)
“An integration of User Experience Design and Agile*
Software Development Methodologies”
*Sometimes
69. 5 - Lean Conversion Optimisation
Positive
– A blend of several techniques
– Multiple sources of Qual and Quant data aids triangulation
– CRO analytics focus drives unearned value inside all
products
Negative
– Needs a one team approach with a strong PM who is a
Polymath (Commercial, Analytics, UX, Technical)
– Only works if your teams can take the pace – you might be
surprised though!
“A blend of User Experience Design, Agile PM, Rapid Lean
UX Build-Measure-Learn cycles, triangulated data sources,
triage and prioritisation.”
71. 5 - Triage and Triangulation
• Starts with the analytics data
• Then UX and user journey walkthrough from SERPS -> key paths
• Then back to analytics data for a whole range of reports:
• Segmented reporting, Traffic sources, Device viewport and
browser, Platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something
as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME
“This is where the smarts of CRO are – in identifying the
easiest stuff to test or fix that will drive the largest uplift.”
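A minimal sketch of the uplift-vs-difficulty scoring and quadrant described above; the opportunities, scores and cut-offs are invented for illustration:

```python
# Each opportunity gets an estimated uplift and a difficulty score (1 = low, 5 = high),
# covering time, resource, complexity and risk. These example entries are invented.
opportunities = [
    {"name": "Fix mobile checkout error", "uplift": 5, "effort": 2},
    {"name": "Rewrite product copy",      "uplift": 2, "effort": 1},
    {"name": "Replatform the funnel",     "uplift": 5, "effort": 5},
    {"name": "New homepage carousel",     "uplift": 1, "effort": 4},
]

def quadrant(item, uplift_cut=3, effort_cut=3):
    high_uplift = item["uplift"] >= uplift_cut
    low_effort = item["effort"] < effort_cut
    if high_uplift and low_effort:
        return "DO FIRST"
    if high_uplift:
        return "PLAN / BIG PROJECT"
    if low_effort:
        return "JFDI / QUICK WIN"
    return "PARK"

# Work the highest and easiest scores first
for item in sorted(opportunities, key=lambda i: i["uplift"] / i["effort"], reverse=True):
    print(f"{item['name']:<28} uplift={item['uplift']} effort={item['effort']} -> {quadrant(item)}")
```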
72. 5 - The Bucket Methodology
“Helps you to stream actions from the insights and prioritisation work.
Forces an action for every issue, a counter for every opportunity being lost.”
Test
If there is an obvious opportunity to shift behaviour, expose insight or
increase conversion – this bucket is where you place stuff for testing. If
you have traffic and leakage, this is the bucket for that issue.
Instrument
If an issue is placed in this bucket, it means we need to beef up the
analytics reporting. This can involve fixing, adding or improving tag or
event handling on the analytics configuration. We instrument both
structurally and for insight in the pain points we’ve found.
Hypothesise
This is where we’ve found a page, widget or process that’s just not working
well but we don’t see a clear single solution. Since we need to really shift
the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by
evidence and data, we’ll create test plans to find the answers to the
questions and change the conversion or KPI figure in the desired direction.
Just Do It
JFDI is a bucket for issues where a fix is easy to identify or the
change is a no-brainer. Items marked with this flag can either be deployed
in a batch or as part of a controlled test. Items in here require low effort
or are micro-opportunities to increase conversion and should be fixed.
Investigate
You need to do some testing with particular devices or need more
information to triangulate a problem you spotted. If an item is in this
bucket, you need to ask questions or do further digging.
73. 5 - Belron example – Funnel replacement
Final prototype -> Usability issues left -> Final changes -> Release build
-> Legal review kickoff -> Cust services review kickoff -> Marketing review
-> Test Plan Signoff (Legal, Mktng, CCC)
-> Instrument analytics -> Instrument Contact Centre -> Offline tagging
-> QA testing -> End-End testing
-> Launch 90/10% -> Monitor -> Launch 80/20% -> Monitor < 1 week -> Launch 50/50% -> Go live 100%
-> Analytics review -> Washup and actions -> New hypotheses -> New test design -> Rinse and Repeat!
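The staged 90/10 -> 80/20 -> 50/50 launch above depends on stable traffic allocation. Here is a minimal sketch of deterministic hash-based bucketing (the experiment name and ramp percentages are placeholders, not Belron's actual implementation):

```python
import hashlib

def bucket(user_id: str, experiment: str = "funnel-replacement") -> int:
    """Deterministically map a user to a bucket 0-99 so the split is stable across visits."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100

def assign(user_id: str, new_funnel_share: int) -> str:
    """new_funnel_share is the % of traffic on the new funnel: 10, then 20, then 50."""
    return "new_funnel" if bucket(user_id) < new_funnel_share else "old_funnel"

# Ramp up as monitoring stays healthy: 90/10 -> 80/20 -> 50/50
for share in (10, 20, 50):
    sample = sum(assign(f"user-{i}", share) == "new_funnel" for i in range(10000))
    print(f"{share}% ramp: {sample / 100:.1f}% of 10,000 simulated users on the new funnel")
```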
75. END SLIDES
Feel free to steal, re-use, appropriate or otherwise lift
stuff from this deck.
If it was useful to you – email me or tweet me and tell me
why – I'd be DELIGHTED to hear!
Regards,
Craig.
Editor's Notes
This stuff is important. What do photographs do? Well, they help me persuade people, influence their thinking, give them directions or cues and explain things – this is the scanning generation! And they're very powerful when selling experiences, stories or using the power of social proof. They help people very quickly (more quickly than reading) discriminate and evaluate – work out what stuff is, how it's organised, what the things are, what's being shown to you. And most importantly, they drive emotional response in people. Whether you like being soggy, wet and without toilet paper for a 30 mile radius or not, a picture like this gets a RESPONSE! Work it! Lastly, a shout out to James Chudley, whose book this example comes from.
So this is quite a powerful area – what about people images? I've tested quite a few of these – in over 20 countries and over 15 languages. What did I find? Well – the person, pose, eye gaze, facial expressions and body language cause visceral emotional reactions and big changes in behaviour. The difference between a crappy image and one optimised to get the right response is huge. And one interesting thing: eye gaze is pretty crucial – to engage you, the viewer, or to 'point' or draw eye gaze and attention to a product. I've tested all angles of viewing and in these people images, the best view is straight at the viewer or slightly away. Any further and the conversion rate drops. It makes a difference I can count.
Tomorrow - Go forth and kick their flabby low converting asses
“A piece of paper with your design mockup. A customer in a shop or bookstore. Their finger is their mouse, the paper their screen. Where would they click? Do they know what these labels mean? Do they see the major routes out of the page? Any barriers? Congratulations, you just got feedback on your design, before writing a single freaking line of code or asking your developers to keep changing stuff.”