Toolkits and tips for UX analytics CRO by Craig Sullivan



UXPA UK March event - Optimising the User Experience

  • And here’s a boring slide about me – and where I’ve been driving over 400M of additional revenue in the last few years. In two months this year alone, I’ve found an additional three-quarters of a million pounds of annual profit for clients. For the sharp-eyed amongst you, you’ll notice that ‘Lean UX’ as a name hasn’t been around since 2008 – many startups and teams were doing this stuff before it got a new name, even if the approach was slightly different. For the last 4 years, I’ve been optimising sites using the combination of techniques I’ll show you today.
  • These are the results of a live test on a site, where an artificial delay is introduced in the performance testing. I’ve done some testing like this myself on desktop and mobile sites and can confirm it’s true – you’re increasing bounce rate, decreasing conversion and reducing site engagement. It doesn’t matter what metric you use: performance equals MONEY, or, if not measured, a HUGE LOSS.
  • Performance also harms the lifeblood of e-commerce and revenue generating websites – repeat visitors! The gap here in one second of delay is enormous over time. You’re basically sucking a huge portion of potential business out of your site, with every additional bit of waiting time you add.
  • These are all people on twitter who cover hybrid stuff – where usability, psychology, analytics and persuasive writing collide. If you follow this lot, you’ll be much smarter within a month, guaranteed.
  • And here are the most useful resources I regularly use or share with people. They have the best and most practical advice – cool insights, but with practical applications. A special mention here to my friends at PRWD, who are one of the few companies blending Psychology, Split Testing and UX for superb gains in rapid time. Check out the resources section on their website.
  • So – what’s driving this change then? Well, there have been great books on selling and persuading people, all the way back to ‘Scientific Advertising’ in 1923. And my favourite here is the Cialdini work – simply because it’s a great help for people to find practical uses for these techniques. I’ve also included some analytics and testing books here, primarily because they help so MUCH in augmenting our customer insight, testing and measurement efforts. There are lots of books with really cool examples, great stories and absolutely no fucking useful information you can use on your website – if you’ve read some of these, you’ll know exactly what I mean. These are the tomes I got most practical use from and I’d recommend you buy the whole lot – worth every penny.

    1. 1. UX and CRO : Toolkits and Tips UXPA UK - 20th March 2014 @OptimiseOrDie
    2. 2. @OptimiseOrDie • UX and Analytics (1999) • User Centred Design (2001) • Agile, Startups, No budget (2003) • Funnel optimisation (2004) • Multivariate & A/B (2005) • Conversion Optimisation (2005) • Persuasive Copywriting (2006) • Joined Twitter (2007) • Lean UX (2008) • Holistic Optimisation (2009) Was : Group eBusiness Manager, Belron Now : Consulting
    3. 3. @OptimiseOrDie Became obsessed with UX Save the Trees Discovered UX Breaking the bonds Getting the mix right UX Hype Cycle
    4. 4. @OptimiseOrDie Timeline Tested stupid ideas, lots Most AB or MVT tests are bullshit Discovered AB testing Triage, Triangulation, Prioritisation, Maths Zen Plumbing AB Test Hype Cycle
    5. 5. @OptimiseOrDie Hands on!
    6. 6. Nice day at the office, dear? @OptimiseOrDie
    7. 7. Craig’s Cynical Quadrant (axes: Improves revenue – Yes/No; Improves UX – Yes/No) Client delighted (and fires you for another UX agency) Client fucking delighted Client absolutely fucking furious Client fires you (then wins an award for your work)
    8. 8. Top Tools & Tips #1 Get out of the office #2 Immerse yourself #3 Session Replay #4 Voice of Customer #5 Get the right inputs #6 Act like a P.I. #7 Experience testing #8 Split testing tools #9 Get performance #10 Analytics health check #11 Going agile Examples @OptimiseOrDie
    9. 9. #1 : GET OUT OF THE OFFICE @OptimiseOrDie
    10. 10. @OptimiseOrDie 1a : Lab Based Testing
    11. 11. @OptimiseOrDie 1b : Remote UX Testing (diagram: 1 = Moderator, 2 = Participant, 3 = Viewers)
    12. 12. 1c : Crowdsourced Testing – Remote UX tools (P=Panel, S=Site recruited, B=Both): Usertesting (B), Userlytics (B), Userzoom (S), Intuition HQ (S), Mechanical turk (S), Loop11 (S), Open Hallway (S), What Users Do (P), Feedback army (P), User feel (P), Ethnio (for recruiting). Feedback on prototypes / mockups: Pidoco, Verify from Zurb, Five second test, Conceptshare, Usabilla 13 @OptimiseOrDie
    13. 13. @OptimiseOrDie 1d : Beer, Caffeine and Work Breaks
    14. 14. DESKTOP & LAPTOP CamStudio (free) Mediacam AV (cheap) Silverback (Mac) Screenflow (Mac) MOBILE UX Recorder (iOS) Skype Hugging Reflection Reflector @OptimiseOrDie 1e : Guerrilla Testing
    15. 15. @OptimiseOrDie 1f : The Secret Millionaire • Tesco placed IT users in front line roles with product • You have to create this kind of feedback loop • If it isn’t there, you need to push/encourage • Connect the team with pain points AND outcomes of their work, split tests and changes • Hugely motivational strategy • One last tip – learn how to interview like a pro • Read these: “Don’t Make Me Think”, “Rocket Surgery Made Easy”, “Talking to Customers”, “Talking with Participants”, “Don’t listen to Users”, “Interviewing Tips”, “More interviewing Tips”
    16. 16. #2 : IMMERSE YOURSELF @OptimiseOrDie • Test ALL key campaigns • Use Real Devices • Get your own emails • Order your products • Call the phone numbers • Send an email • Send 11 shoes back • Be difficult • Break things • Experience the end-end • Do the same for competitors • Team are ALL mystery shoppers • Wear the magical slippers • Be careful about dogfood though!
    17. 17. • Vital for optimisers & fills in a ‘missing link’ for insight • Rich source of data on visitor experiences • Segment by browser, visitor type, behaviour, errors • Forms Analytics (when instrumented) are awesome • Can be used to optimise in real time! Session replay tools • Clicktale (Client) • SessionCam (Client) • Mouseflow (Client) • Ghostrec (Client) • Usabilla (Client) • Tealeaf (Hybrid) • UserReplay (Server) @OptimiseOrDie #3 : GET SESSION REPLAY
    18. 18. • Sitewide Omnipresent Feedback • Triggered (Behavioural) Feedback • Use of Features, Cancellation, Abandonment • 4Q Task Gap Analysis very good • Kampyle • Qualaroo • Feedback Daddy • 4Q • Usabilla #4 : GET THEIR VOICE
    19. 19. • Make contact and feedback easy & encouraged • Add contact & feedback to everything (e.g. all mails) • Read Caroline Jarrett, run surveys (remember them?) • Run regular NPS and behaviourally triggered surveys • Get ratings on Service Metrics • Find what drives the ‘level’ of delight • Ask your frequent, high spend, zealous users questions • Make the team spend ½ a day a month at the Call Centre • Meet with your Sales and Support teams ALL the time • Tip : Take them for Beers and encourage bitching #4 : GET THEIR VOICE
    20. 20. Insight - Inputs #FAIL Competitor copying Guessing Dice rolling An article the CEO read Competitor change Panic Ego Opinion Cherished notions Marketing whims Cosmic rays Not ‘on brand’ enough IT inflexibility Internal company needs Some dumbass consultant Shiny feature blindness Knee-jerk reactions #5 : Your inputs are all wrong @OptimiseOrDie
    21. 21. Insight - Inputs Insight Segmentation Surveys Sales and Call Centre Session Replay Social analytics Customer contact Eye tracking Usability testing Forms analytics Search analytics Voice of Customer Market research A/B and MVT testing Big & unstructured data Web analytics Competitor evalsCustomer services #5 : These are the inputs you need… @OptimiseOrDie
    22. 22. • For your brand(s) and competitors • Check review sites, Discussion boards, News • Use Google Alerts on various brands & keywords • See what tools they’re using • Sign up for all competitor emails • Run Cross Competitor surveys • This was VITAL for LOVEFiLM • Use Social & Competitor Monitoring tools : #6 : ACT LIKE A PI
    23. 23. #4 – Test or Die! Email testing Browser testing Mobile devices @OptimiseOrDie #7 : MAKE MONEY FROM TESTING!
    24. 24. • Google Content Experiments • Optimizely • Visual Website Optimizer • Multi Armed Bandit Explanation • New Machine Learning Tools @OptimiseOrDie #8 : MAKE MORE MONEY FROM TESTING!
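The "Multi Armed Bandit" item on this slide refers to allocation schemes that shift traffic toward winning variants as results come in, rather than splitting it evenly for the whole test. Here is a minimal epsilon-greedy sketch in Python; the function names and the 10% exploration rate are my own illustrative choices, not taken from the deck or from any of the tools listed.

```python
import random

def epsilon_greedy(counts, rewards, epsilon=0.1):
    """Pick a variant index: explore at random with probability epsilon,
    otherwise exploit the variant with the best observed conversion rate.

    counts[i]  = visitors shown variant i so far
    rewards[i] = conversions recorded for variant i so far
    """
    if random.random() < epsilon:
        return random.randrange(len(counts))
    rates = [r / c if c else 0.0 for r, c in zip(rewards, counts)]
    return max(range(len(rates)), key=lambda i: rates[i])

def record(counts, rewards, arm, converted):
    """Update the running tallies after a visitor's outcome is known."""
    counts[arm] += 1
    rewards[arm] += 1 if converted else 0
```

With `epsilon=0.0` the function always exploits, so after 100 visitors per arm with 10 vs 30 conversions it routes everyone to the second variant; the exploration fraction is what keeps it from locking in too early on noisy data.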
    25. 25. #9 – Fight! • Google PageSpeed Tools 26 @OptimiseOrDie
    26. 26. Site Size Requests The Daily Mail 4574k 437 Starbucks 1300k 145 Direct Line 887k 45 Ikea (.se) 684k 14 Currys 667k 68 Marks & Spencer 308k 45 Tesco 234k 15 The Guardian 195k 35 BBC News 182k 62 Auto Trader 151k 47 Amazon 128k 16 Aviva 111k 18 Autoglass 25k 10 Real testing : @OptimiseOrDie
    27. 27. Slides : If you really care, download this deck: @OptimiseOrDie
    28. 28. Scare the Ecom or Trading director:
    29. 29. #10 : Your analytics tool is broken! @OptimiseOrDie
    30. 30. • Get a Health Check for your Analytics – Mail me for a free pack • Invest continually in instrumentation – Aim for at least 5% of dev time to fix + improve • Stop shrugging : plug your insight gaps – Change ‘I don’t know’ to ‘I’ll find out’ • Look at event tracking (Google Analytics) – If set up correctly, you get wonderful insights • Would you use paper instead of a till? – You wouldn’t do it in retail so stop doing it online! • How do you win F1 races? – With the wrong performance data, you won’t @OptimiseOrDie #10 : Your analytics tool is broken!
    31. 31. #11 : Go Agile @OptimiseOrDie
    32. 32. Methodologies - Lean UX Positive – Lightweight and very fast methods – Realtime or rapid improvements – Documentation light, value high – Low on wastage and frippery – Fast time to market, then optimise – Allows you to pivot into new areas Negative – Often needs user test feedback to steer the development, as data not enough – Bosses distrust stuff where the outcome isn’t known “The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles.” 33
    33. 33. Agile UX / UCD / Collaborative Design Positive – User centric – Goals met substantially – Rapid time to market (especially when using Agile iterations) Negative – Without quant data, user goals can drive the show – missing the business sweet spot – Some people find it hard to integrate with siloed teams – Doesn’t work with waterfall IMHO Wireframe Prototype TestAnalyse Concept Research “An integration of User Experience Design and Agile* Software Development Methodologies” *Sometimes 34
    34. 34. CRO 35
    35. 35. Lean Optimisation Positive – A blend of several techniques – Multiple sources of Qual and Quant data aids triangulation – Focus on priority opportunities drives unearned value and customer delight for all products Negative – Needs a one team approach with a strong PM who is a Polymath (Commercial, Analytics, UX, Technical) – Only works if your teams can take the pace – you might be surprised though! “A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation.” 36
    36. 36. Lean CRO Inspection Immersion Identify Triage & Triangulate Outcome Streams Measure Learn Instrument 37
    37. 37. We believe that doing [A] for People [B] will make outcome [C] happen. We’ll know this when we observe data [D] and obtain feedback [E]. (reverse) @OptimiseOrDie
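The hypothesis template on this slide can be captured as a tiny helper so every test in a backlog states its action, audience, expected outcome and evidence in the same shape. A minimal sketch; the function and argument names are mine, not from the deck.

```python
def hypothesis(action, audience, outcome, data_signal, feedback_signal):
    """Render the slide's A/B/C/D/E template as one testable statement."""
    return (
        f"We believe that doing {action} for {audience} "
        f"will make {outcome} happen. "
        f"We'll know this when we observe {data_signal} "
        f"and obtain feedback {feedback_signal}."
    )


# Hypothetical example values, for illustration only:
h = hypothesis(
    "inline form validation",
    "mobile checkout visitors",
    "a higher completion rate",
    "a lift in checkout conversion",
    "fewer error-related support contacts",
)
```

Forcing every proposed change through the same sentence is what keeps a test plan honest: if you cannot fill in slots D and E, you have an opinion, not a hypothesis.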
    38. 38. agile - Summary • Design your own methodology Experiment and optimise with your team • Don’t be a slave The methodology is the slave, not your master • Collaborative working – Harvard study into teams – it’s an all the time thing • Ask me later… Questions – see me on Twitter, G+ or ask by mail @OptimiseOrDie
    39. 39. Scarcity principle... #1 : EXAMPLES
    40. 40. Scarcity principle... #1 : EXAMPLES
    41. 41. • 20M+ visitor tests with People Images • Some interesting stuff at Autoglass (Belron) • Negative body language is a turnoff • Uniforms and branding a positive (ball cap) • Eye gaze and smile are crucial • Hands are awkward without a prop • Best prop tested was a clipboard • Single image better than groups • In most countries (out of 33) with strong female and male images in test, female won • So – a question about this test @OptimiseOrDie #2 : SPLIT TESTING PEOPLE
    42. 42. @OptimiseOrDie
    43. 43. Terrible Stock Photos : & Laughing at Salads : Other Stock Memes : BBC Fake Smile Test : @OptimiseOrDie
    44. 44. UK Wave 2 - Isi
    45. 45. TV - Off TV - On Isi went on to star in the TV slot and helped Autoglass grow recruitment of female technicians, as well as proving a point! #3 : TV ADVERTISING
    46. 46. SPAIN +22% over control 99% confidence @OptimiseOrDie #3 : PHOTOGRAPHY UX
    47. 47. @OptimiseOrDie #4 : VOC, NPS, EXPERIMENTS • Belron NPS programme is huge • Millions of people every year, across the world • 35% survey takeup, 6% dropout rate! • (Try @lukew and @cjforms and @stickycontent) • Higher scores than some consumer products • Why? Measuring the drivers of delight • Even on A/B tests, we could split NPS data • We could see a new funnel drove a 5.5% rise • Lovefilm beat their competitors using NPS • How? Measuring key service metrics • Regression to find high value investment areas • Contact deflection using self service • Analytics, split testing, UX
    48. 48. How is it working out for Craig? • Methodologies are not Real Life ™ • It’s mainly about the mindset of the team and managers, not the tools or methodologies used • Not all my clients have all the working parts • Use some, any techniques instead of ‘guessing’ • Bringing together UX techniques with the excellent tools available – along with analytics investment – will bring you successful and well-loved products • Blending Lean and Agile UX with conversion optimisation techniques (analytics, split testing, Kaizen, Kano) is my critical insight from the last 5 years. • UX got hitched to numbers, they ran away and lived happily ever after 49
    49. 49. If it isn’t working, you’re not doing it right @OptimiseOrDie
    50. 50. Email Twitter : : @OptimiseOrDie : Slides on tonight!
    51. 51. RESOURCE PACK • Maturity model • Best CRO people on twitter • Best Web resources • Good recent books to read • Triage and Triangulation • The Bucket outcome methodology • Belron methodology example • CRO and testing resources • Companies and people to watch • Building a ring model • Manual Models for Analytics @OptimiseOrDie
    52. 52. Ad Hoc Local Heroes Chaotic Good Level 1 Starter Level Guessing A/B testing Basic tools Analytics Surveys Contact Centre Low budget usability Outline process Small team Low hanging fruit + Multi variate Session replay No segments +Regular usability testing/research Prototyping Session replay Onsite feedback Dedicated team Volume opportunities Cross silo team Systematic tests Ninja Team Testing in the DNA Well developed Streamlined Company wide +Funnel optimisation Call tracking Some segments Micro testing Bounce rates Big volume landing pages + Funnel analysis Low converting & High loss pages + offline integration Single channel picture + Funnel fixes Forms analytics Channel switches +Cross channel testing Integrated CRO and analytics Segmentation +Spread tool use Dynamic adaptive targeting Machine learning Realtime Multichannel funnels Cross channel synergy Testing focus Culture Process Analytics focus Insight methods +User Centered Design Layered feedback Mini product tests Get buyin Mission Prove ROI Scale the testing Mine value Continual improvement + Customer sat scores tied to UX Rapid iterative testing and design + All channel view of customer Driving offline using online All promotion driven by testing Level 2 Early maturity Level 3 Serious testing Level 4 Core business value Level 5 You rock, awesomely 53
    53. 53. Best of Twitter @OptimiseOrDie @danbarker Analytics @fastbloke Analytics @timlb Analytics @jamesgurd Analytics @therustybear Analytics @carmenmardiros Analytics @davechaffey Analytics @priteshpatel9 Analytics @cutroni Analytics @Aschottmuller Analytics, CRO @cartmetrix Analytics, CRO @Kissmetrics CRO / UX @Unbounce CRO / UX @Morys CRO/Neuro @PeepLaja CRO @TheGrok CRO @UIE UX @LukeW UX / Forms @cjforms UX / Forms @axbom UX @iatv UX @Chudders Photo UX @JeffreyGroks Innovation @StephanieRieger Innovation @DrEscotet Neuro @TheBrainLady Neuro @RogerDooley Neuro @Cugelman Neuro
    54. 54. Best of the Web @OptimiseOrDie
    55. 55. Best of Books @OptimiseOrDie
    56. 56. Triage and Triangulation • Starts with the analytics data • Then UX and user journey walkthrough from SERPS -> key paths • Then back to analytics data for a whole range of reports: • Segmented reporting, Traffic sources, Device viewport and browser, Platform (tablet, mobile, desktop) and many more • We use other tools or insight sources to help form hypotheses • We triangulate with other data where possible • We estimate the potential uplift of fixing/improving something as well as the difficulty (time/resource/complexity/risk) • A simple quadrant shows the value clusters • We then WORK the highest and easiest scores by… • Turning every opportunity spotted into an OUTCOME “This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift.” @OptimiseOrDie
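The uplift-versus-difficulty quadrant described on this slide amounts to ranking opportunities by expected value per unit of effort, then working the highest-scoring ones first. A minimal sketch, with made-up issue names and scores purely for illustration:

```python
def prioritise(opportunities):
    """Rank opportunities by estimated uplift per unit of effort,
    highest first. Each item needs 'uplift' and 'effort' scores
    (any consistent scale, e.g. 1-10 from the triage session)."""
    return sorted(
        opportunities,
        key=lambda o: o["uplift"] / o["effort"],
        reverse=True,
    )


# Hypothetical triage output: uplift and effort scored 1-10.
issues = [
    {"name": "checkout form errors",     "uplift": 8, "effort": 2},
    {"name": "homepage hero rewrite",    "uplift": 3, "effort": 5},
    {"name": "basket page trust badges", "uplift": 5, "effort": 1},
]

ranked = prioritise(issues)
```

The ratio is a crude stand-in for the quadrant plot, but it captures the same smarts: the trust-badge fix (5 uplift for 1 effort) beats the bigger but costlier checkout work, and the hero rewrite drops to the bottom.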
    57. 57. The Bucket Methodology “Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost.”  Test If there is an obvious opportunity to shift behaviour, expose insight or increase conversion – this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.  Instrument If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling on the analytics configuration. We instrument both structurally and for insight in the pain points we’ve found.  Hypothesise This is where we’ve found a page, widget or process that’s just not working well but we don’t see a clear single solution. Since we need to really shift the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by evidence and data, we’ll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.  Just Do It JFDI (Just Do It) – is a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Stuff in here requires low effort or are micro-opportunities to increase conversion and should be fixed.  Investigate You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging. 58
    58. 58. 5 - Belron example – Funnel replacement Final prototype Usability issues left Final changes Release build Legal review kickoff Cust services review kickoff Marketing review Test Plan Signoff (Legal, Mktng, CCC) Instrument analytics Instrument Contact Centre Offline tagging QA testing End-End testing Launch 90/10% Monitor Launch 80/20% Monitor < 1 week Launch 50/50% Go live 100% Analytics review Washup and actions New hypotheses New test design Rinse and Repeat!
    59. 59. CRO and Testing resources • 101 Landing page tips : • 544 Optimisation tips : • 108 Optimisation tips : • 32 CRO tips : • 57 CRO books : • CRO article list : • Smashing Mag article : @OptimiseOrDie
    60. 60. So you want examples? Examples of companies putting this stuff together in a good way: • Belron – Ed Colley • Dell – Nazli Yuzak • Shop Direct – Paul Postance (now with EE) • Expedia – Oliver Paton • Schuh – Stuart McMillan • Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann • – Adam Bailin (now with the BBC) Read the principles : And my personal favourite of 2013 – Airbnb! @OptimiseOrDie
    61. 61. #1 Building a Model #1 Avinash article #2 The Ring Model #3 3 examples #4 Benefits #5 Summary 62
    62. 62. 6.1 – Avinash “See-Think-Do” • Avinash Kaushik, analytics guru, proposes a very nice model for marketing. A brilliant article can be read here: • business-framework/ • But this sort of thinking is also relevant to optimisation • CRO often focuses on purely the ‘Do’ stage – rather than ‘See’ or ‘Think’ stages. 63
    63. 63. 6.1 – Example 64
    64. 64. 6.2 – The Ring Model • Simply looking at conversion points is not enough • We need a way to look at the ‘layers’ or ‘levels’ reached • So I developed a ring or engagement model • This works for many (but not all) websites • Focuses on depth of engagement, not pages viewed • Helps to see the key loss steps, like a funnel • It’s not a replacement for funnel diagrams • It helps to see the ‘big picture’ involved • So – let’s try some examples 65
    65. 65. 6.3 – Examples – Concept 66 Bounce Engage Outcome
    66. 66. 6.3 – Examples – 67 Bounce Search or Category Product Page Add to basket View basket Checkout Complete
    67. 67. 6.3 – Examples – 68 Bounce Login to Account Content Engage Start Application Type and Details Eligibility Photo Complete
    68. 68. 6.3 – Examples – Guide Dogs 69 Bounce Content Engage Donation Pathway Donation Page Starts process Funnel steps Complete
    69. 69. 6.3 – Within a layer 70 Page 1 Page 2 Page 3 Page 4 Page 5 Exit Deeper Layer Email LikeContact Wishlist Micro Conversions
    70. 70. 6.4 – Benefits • Helps you see where flow is ‘stuck’ • Sorts out small opportunities from big wins • Ignores pages in favour of ‘Macro’ and ‘Micro’ conversions • Lets you show the client where focus should be • Helps flush flow or traffic through to lower levels • Avoids prioritising the wrong part of the model! • Example – Shoprush problem is basket adds, not checkout • If you had 300k product page views, 5k adds and 1k checkouts – where would your problem be? • If you had 300k product page views, 100k adds and 1k checkouts – it’s a different place! • Example – Google adwords site has bad traffic, not conversion 71
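The drop-off reasoning in the benefits list can be computed layer by layer rather than eyeballed. A small sketch using the slide's own example numbers (300k product page views, 5k basket adds, 1k checkouts); the function name is mine.

```python
def layer_dropoffs(layers):
    """Given ordered (layer_name, visitors) pairs from a ring model,
    return the percentage lost at each transition between layers."""
    losses = []
    for (name_a, v_a), (name_b, v_b) in zip(layers, layers[1:]):
        lost_pct = round(100 * (1 - v_b / v_a), 1)
        losses.append((f"{name_a} -> {name_b}", lost_pct))
    return losses


# The slide's first scenario: the problem is basket adds, not checkout.
funnel = [
    ("product page", 300_000),
    ("add to basket", 5_000),
    ("checkout", 1_000),
]
losses = layer_dropoffs(funnel)
```

Here 98.3% of visitors never add to basket, versus 80% lost between basket and checkout, so the model points the optimisation effort at the product page layer, exactly the "different place" argument the slide makes with the 100k-adds variant.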
    71. 71. 6.5 – Benefits contd. • A nice simple way to visualise complex websites • Does not rely on pages – more ‘steps’ or ‘layers’ • Helps you see where traffic is ‘stuck’ or ‘failing to engage more deeply’ • The combination of traffic potential, UX and persuasion issues combines to identify opportunity • Avoids visual bias when doing an expert review • In the e-commerce example, Rush have optimised product page first, not homepage. • Questions? 72
    72. 72. QUESTIONS? 73
    73. 73. #5 By Hand Analytics #1 When to use this method? #2 How to use it #3 Demo #4 Limitations 74
    74. 74. 5.1 – When to use this method • If goals are unreliable / broken / have no data • If flows are mixed in funnels (mid stage joiners) • If the conceptual model does not match site config • When the data you need does not exist 75
    75. 75. 5.2 – How to use this method • For example, with a funnel • Use UNIQUE PAGEVIEWS (and events, if available) • Do NOT mix with pageviews or visitor counts • Step 1 – Basket UPVs • Step 2 – Customer details • Step 3 – Shipping • Step 4 – Payment • Step 5 – Thank you • Use regex / advanced segments to aggregate or filter • Gives you a unique count of people at steps • Always be aware of time periods! 76
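Aggregating unique pageviews into funnel steps with regular expressions, as described above, might look like the following in Python. Google Analytics does this counting for you via regex filters and advanced segments; this is only a sketch of the by-hand logic, with hypothetical page paths and patterns.

```python
import re
from collections import defaultdict

def unique_pageviews(hits, step_patterns):
    """Count unique (visitor, step) pairs, emulating a unique-pageview
    count per funnel step.

    hits:          iterable of (visitor_id, page_path)
    step_patterns: ordered list of (step_name, regex) pairs
    """
    seen = defaultdict(set)
    for visitor, path in hits:
        for step, pattern in step_patterns:
            if re.search(pattern, path):
                seen[step].add(visitor)  # set dedupes repeat views
                break
    return [(step, len(seen[step])) for step, _ in step_patterns]


# Hypothetical raw hits: v2 views the basket twice but counts once.
hits = [
    ("v1", "/basket"),
    ("v1", "/checkout/details"),
    ("v1", "/checkout/payment"),
    ("v2", "/basket"),
    ("v2", "/basket"),
]
steps = [
    ("basket",  r"^/basket"),
    ("details", r"^/checkout/details"),
    ("payment", r"^/checkout/payment"),
]
counts = unique_pageviews(hits, steps)
```

The dedup via sets is the point of the slide's warning: mixing raw pageviews or visitor counts into the same funnel would double-count v2's repeat basket view and break the step-to-step comparison.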
    76. 76. 5.3 – Limitations & Benefits • Mixing and matching data can look nice but causes issues • Time consuming and more complex • Try to use in-page filters not advanced segments (sampling) • Is not readily replayed by client Some benefits though: • Construct segmented funnels • Split by other data attributes • Very good way to spot variances inside funnels • Vital for multi-device category websites 77
    77. 77. END SLIDES 78 Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck. If it was useful to you – email me or tweet me and tell me why – I’d be DELIGHTED to hear! Regards, Craig.