10 Top CRO Questions, Tips and Toolkits : eMetrics San Francisco - 17 April 2013


I'm presenting at eMetrics here in San Francisco, but I also attended a lot of sessions myself. I listened to the questions and included the most common CRO niggles and issues as part of my deck. It also includes analytics tips (particularly for Google Analytics, but they can be applied elsewhere) and details of a good CRO toolkit to have in your pocket.

Lastly, I've included a BONUS DECK here. Yay. You'll find some details on the methodologies I support and some of the results they produced. I love lean techniques blended with rapid support from analytics, split testing, rapid UX techniques and many other toolsets.

Slide notes
  • “A piece of paper with your design mockup. A customer in a shop or bookstore. Their finger is their mouse, the paper their screen. Where would they click? Do they know what these labels mean? Do they see the major routes out of the page? Any barriers? Congratulations, you just got feedback on your design, before writing a single freaking line of code or asking your developers to keep changing stuff.”
  • Create a suck index = pageviews * load time.
  • Here I show you some examples of well-known brands, some of whom should know better. The larger the page, the longer it takes to download and render on the device, especially when data conditions aren't perfect. The number of requests also makes a difference, as it's inefficient on mobile to open lots of connections like this. In short, the smaller the page size and the fewer the requests, the better. I'm patient with bad data connections, but do people have the tolerance for 10-15 seconds on mobile? No – it has to happen much faster.
  • These are the results of a live test on a site, where an artificial delay is introduced in the performance testing. I've done testing like this myself on desktop and mobile sites and can confirm it's true – you're increasing bounce rate, decreasing conversion and site engagement… It doesn't matter what metric you use: performance equals MONEY or, if not measured, a HUGE LOSS.
  • Performance also harms the lifeblood of e-commerce and revenue generating websites – repeat visitors! The gap here in one second of delay is enormous over time. You’re basically sucking a huge portion of potential business out of your site, with every additional bit of waiting time you add.
  • Add unique phone numbers to all your mobile sites and apps. That's for starters. Then configure your analytics to collect data when people Click or Tap to make a phone call. Make sure you add other events like ringbacks, email, chat – any web forms or lead gen activity too.
  • So what does this graph say? That I have a long tail thing I want to talk to you about? No – this shows the ratio of phone to online conversion we have, by keyword. Some keywords generate nearly 25 times the call volume of others, which is a huge differential. This means that if you thought you got ‘roughly’ the same proportion of phone calls for different marketing activity, you are wrong. What this graph tells me is that the last 2 years of my stats are basically a big dog poo.
  • Phone tracking costs you nothing – you can add it in a few minutes to your app or mobile website by changing your analytics tracking. Now you can see exactly which bits of inbound marketing are driving telephone and other contact channels. If you have any sort of phone component in your service or support, the insight could be vital. You can take traffic by keyword, source, campaign or advert creative and work out the TRUE mix of conversion activity. And all this is available on desktop too – by using dynamic numbers, we can track exactly the same stuff. Talk to this company: www.infinity-tracking.com

    1. Confessions of a Split Tester – @OptimiseOrDie
    2. "If you go to the men's washrooms at Schiphol airport in Amsterdam, you may notice there's a fly in the urinals. It's screen printed. So what do you think most men do? That's right, they aim at the fly when they urinate. They don't even think about it, and they don't need to read a user's manual; it's just an instinctive reaction that means 85% less spillage! The interesting feature of these urinals is that they're deliberately designed to take advantage of this inherent human male tendency." This is my job.
    3. Shameless Promotion Slide – Director of Optimization, RUSH Hair. Building a team, conversion rate or methodology?
       • 35M visitor split tests and counting
       • Over $400M increases in revenue for clients, within 5 yrs
       • Lifts from 12% to 200+% in site-wide conversion rates
       • 21 years of my life slowly being sucked away in boring meetings with time-wasting morons
       • UX and Analytics (1999) • User Centred Design (2001) • Startups and advisory (2003) • Funnel optimisation (2004) • Multivariate & A/B (2005) • Lean UX (2008) • Holistic Optimisation (2009)
       • I love optimizing underperforming stuff: websites, teams, businesses and multi-country optimisation programs! @OptimiseOrDie
    4. Is there a way to fix this then?
    5. Agenda
       #1 The Optimisers Toolkit – the best tools recommended by CRO practitioners
       #2 Analytics Genius Tips – top tips from 2013, from around the world
       #3 Top CRO Questions – some answers from my Top 30 CRO questions
    6. The Optimisers Toolkit: #1 Session Replay, #2 Browser & Email testing, #3 VOC, Survey & Feedback tools, #4 Guerrilla Usability, #5 Productivity tools, #6 Split testing, #7 Performance, #8 Crowdsourcing, #9 Analytics Love
    7. #1: Session Replay – 3 kinds of tool:
       Client side – normally JavaScript based. Pros: rich mouse and click data, errors, forms analytics, UI interactions. Cons: dynamic content issues, performance hit.
       Server side – black box -> proxy, sniffer, port copying device. Pros: gets all dynamic content, fast, legally tight. Cons: no client-side interactions, Ajax, HTML5 etc.
       Hybrid – client side and sniffing with a central data store.
    8. #1: Session Replay – vital for optimisers and fills in a 'missing link' for insight. A rich source of data on visitor experiences. Segment by browser, visitor type, behaviour, errors. Forms analytics (when instrumented) are awesome. Can be used to optimise in real time! Session replay tools: Clicktale (Client) www.clicktale.com, SessionCam (Client) www.sessioncam.com, Mouseflow (Client) www.mouseflow.com, Ghostrec (Client) www.ghostrec.com, Usabilla (Client) www.usabilla.com, Tealeaf (Hybrid) www.tealeaf.com, UserReplay (Server) www.userreplay.com
    12. #2: Feedback / VOC tools – anything that allows immediate real-time on-page feedback. Comments on elements, pages and the overall site and service. Can be used for behaviourally triggered feedback. Tip: take the Call Centre for beers. Kampyle www.kampyle.com, Qualaroo www.qualaroo.com, 4Q 4q.iperceptions.com, Usabilla www.usabilla.com
    13. #2a: Survey Tools – Surveymonkey www.surveymonkey.com (1/5), Zoomerang www.zoomerang.com (3/5), SurveyGizmo www.surveygizmo.com (5/5). For surveys, web forms, checkouts, lead gen – anything with form filling – you have to read these two: Caroline Jarrett (@cjforms) and Luke Wroblewski (@lukew). With their work and copywriting from @stickycontent, I managed to get a survey with a 35% clickthrough from email and a whopping 94% form completion rate. Their awesome insights are the killer app I have when optimising forms and funnel processes for clients.
    14. #3: Testing tools
       Email testing: www.litmus.com, www.returnpath.com, www.lyris.com
       Browser testing: www.browsercam.com (BOING!), www.crossbrowsertesting.com, www.cloudtesting.com, www.multibrowserviewer.com, www.saucelabs.com
       Mobile devices: www.perfectomobile.com, www.deviceanywhere.com, www.mobilexweb.com/emulators, www.opendevicelab.com
    15. #4: Guerrilla Usability Testing – all you need is a device, time and people! Use one of these tools for session recording: CamStudio (free) www.camstudio.org, Mediacam AV (cheap) www.netu2.co.uk, Silverback (Mac) www.silverbackapp.com, Screenflow (Mac) www.telestream.net, UX Recorder (iOS), Reflection, Webcam – www.uxrecorder.com & bit.ly/tesTfm & bit.ly/GZMgxR
    16. #5: Productivity tools – Oh sh*t
    17. #5 Join.me
    18. #5 Pivotal Tracker
    19. #5 Basecamp
    20. #5 Google Docs – Seriously wasting your time doing manual Excel? Fed up doing stuff that takes hours? Use the Google API to roll your own reports straight into Big G. Lots of good articles, but ask for advice from @danbarker, @timlb and #measurecamp. Google Analytics + API + Google Docs integration = A BETTER LIFE! Hack your way to having more productive weeks. Learn how to do this and have fun with GA custom reports. Ask me about the importance of training.
    21. #5 Cloud Collaboration – LucidChart
    22. #5 Cloud Collaboration – Webnotes
    23. #5 Cloud Collaboration – Protonotes
    24. #5 Cloud Collaboration – Conceptshare
    25. #6: Split testing tools – cheap! Google Content Experiments bit.ly/Ljg7Ds, Multi Armed Bandit explanation bit.ly/Xa80O8, Optimizely www.optimizely.com, Visual Website Optimizer www.visualwebsiteoptimizer.com
    26. #7 Performance – Google Site Speed, Webpagetest.org, Mobitest.akamai.org
    27. Remote tests: iPhone
       Site                Size    Requests
       www.coop.se         1200k   63
       m.ikea.com/se/sv/   684k    14
       High St Retailer    307k    43
       Department Store    100k    18
       Newspaper           195k    35
       Supermarket         125k    14
       Auto Sales          151k    47
       Autoglass           25k     10
    28. Show your boss this split test: (Slides: slidesha.re/PDpTPD)
    29. Show the e-com director this one:
    30. #7: UX and Crowd tools – some feedback.
       Remote UX tools (P=Panel, S=Site recruited, B=Both): Usertesting (B) www.usertesting.com, Userlytics (B) www.userlytics.com, Userzoom (S) www.userzoom.com, Intuition HQ (S) www.intuitionhq.com, Mechanical turk (S) www.mechanicalturk.com, Loop11 (S) www.loop11.com, Open Hallway (S) www.openhallway.com, What Users Do (P) www.whatusersdo.com, Feedback army (P) www.feedbackarmy.com, User feel (P) www.userfeel.com, Ethnio (for recruiting) www.ethnio.com
       Feedback on prototypes / mockups: Pidoco www.pidoco.com, Verify from Zurb www.verifyapp.com, Five second test www.fivesecondtest.com, Conceptshare www.conceptshare.com, Usabilla www.usabilla.com
    31. #8: Web Analytics Love – properly instrumented analytics, an investment of 5-10% of developer time, add more than you need, events insights, segmentation, call tracking love!
    32. #8: Tap 2 Call tracking
       Step 1: Add a unique phone number on ALL channels (or insert your own dynamic number).
       Step 2: For phones, add "Tap to Call" or "Click to Call". Add an analytics event or tag for phone calls! Very reliable data, easy and cheap to do. What did they do before calling? Which page did they call you from? What PPC or SEO keyword did they use? This keyword-level call data is incredibly useful. What are you over- or under-bidding for? It will help you shave 10-20%+ off PPC. Which online marketing really sucks?
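The tap-to-call event in Step 2 can be sketched with classic Google Analytics (`_gaq`). This is a minimal, hedged example: `buildCallEvent` is a hypothetical helper name, and the category/action/label values are illustrative, not a prescribed scheme.

```javascript
// Build a classic GA event payload for a tapped phone number.
// Format: ['_trackEvent', category, action, label]
function buildCallEvent(phoneNumber, pageName) {
  return ['_trackEvent', 'phone', 'tap-to-call', pageName + '|' + phoneNumber];
}

// In a real page you would wire it to tel: links (guarded so the
// snippet also loads outside a browser context):
if (typeof document !== 'undefined') {
  var _gaq = window._gaq || [];
  document.addEventListener('click', function (e) {
    var link = e.target.closest('a[href^="tel:"]');
    if (link) {
      _gaq.push(buildCallEvent(link.href.replace('tel:', ''),
                               location.pathname));
    }
  });
}
```

Once these events flow into your analytics, the keyword/source/campaign reports mentioned above come for free via normal segmentation.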
    33. Phone to Booking Ratio – [chart of the phone-to-booking ratio by search keyword, e.g. 'safelite windshield chip repair', 'auto glass repair', 'windshield replacement costs', 'car window repair cost', 'windshield crack']
    34. What about desktop?
       Step 1: Add 'Click to reveal' – it can be a link, button or a collapsed section. Add it to your analytics software. This is a great budget option!
       Step 2: Invest in call analytics – unique visitor tracking for desktop. It gives you that detailed marketing data, is easy to implement and integrates with your web analytics. Let me explain…
    35. So what does phone tracking get you? You can do it for free on your online channels. If you've got any phone sales or contact operation, this will change the game for you. For the first time, analytics for PHONE for the web to claim. Optimise your PPC spend. Track and test stuff on phones, using web technology. The two best phone A/B tests? You'll laugh!
    36. Who?
       Company                  Website                        Coverage
       Mongoose Metrics*        www.mongoosemetrics.com        UK, USA, Canada
       Ifbyphone*               www.ifbyphone.com              USA
       TheCallR*                www.thecallr.com               USA, Canada, UK, IT, FR, BE, ES, NL
       Call tracking metrics    www.calltrackingmetrics.com    USA
       Hosted Numbers           www.hostednumbers.com          USA
       Callcap                  www.callcap.com                USA
       Freespee*                www.freespee.com               UK, SE, FI, NO, DK, LT, PL, IE, CZ, SI, AT, NL, DE
       Adinsight*               www.adinsight.co.uk            UK
       Infinity tracking*       www.infinity-tracking.com      UK
       Optilead*                www.optilead.co.uk             UK
       Switchboard free         www.switchboardfree.co.uk      UK
       Freshegg                 www.freshegg.co.uk             UK
       Avanser                  www.avanser.com.au             AUS
       Jet Interactive*         www.jetinteractive.com.au      AUS
       * I read up on these or talked to them. These are my picks.
    38. #9: Web Analytics Love – people, process, human problems. The UX of web analytics tools and reports. Make the UI force decisions! Playability and exploration. Skunkworks project time (5-10%). Give it love, time, money and iteration. How often do you iterate analytics? Lastly, spend to automate and gain MORE time.
    39. #3 2013 Tips roundup!
    40. Analytics Genius Tips: #1 Performance tune-ups, #2 Browser money, #3 Keyboard shortcuts, #4 Ranking data, #5 Content engagement, #6 Enhanced In-Page, #7 Duplicate transactions, #8 Event tracking, #9 Google API + more
    41. #1: Performance Tune-ups (with thanks to @danbarker)
       Add _gaq.push(['_setSiteSpeedSampleRate', 100]); – it amps up sampling for small and medium websites. Use the distribution report – % of pages < 3 seconds. DOM timings are vital – let me explain: Avg. Document Content Loaded Time (sec). This data is very accurate and helps conversion – pretty vital for landing pages, where I find lots of stuff. Make yourself a [pageviews * content load time] report – this is called a 'Suck Index'. Work your way down from the top. Mobile speed doesn't count Safari – please be careful! Read more at: http://p.barker.dj/sitespeedtips
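The 'Suck Index' report above (pageviews * content load time) is easy to roll yourself once you've exported page-level data. A minimal sketch, with made-up sample figures; `suckIndex` is an illustrative helper name:

```javascript
// Suck Index = pageviews * average content load time.
// Sort worst-first so you can 'work your way down from the top'.
function suckIndex(pages) {
  return pages
    .map(function (p) {
      return { url: p.url, score: p.pageviews * p.avgLoadSec };
    })
    .sort(function (a, b) { return b.score - a.score; });
}

// Illustrative data only:
var report = suckIndex([
  { url: '/home',     pageviews: 50000, avgLoadSec: 2.1 },
  { url: '/checkout', pageviews: 8000,  avgLoadSec: 9.5 },
  { url: '/about',    pageviews: 1200,  avgLoadSec: 4.0 }
]);
// report[0].url is the page to fix first
```

Note how a moderately slow but heavily viewed page can outrank a very slow but rarely viewed one – which is exactly the prioritisation the index is for.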
    42. #2: Browser money
       Create a desktop-only segment [exclude "mobile including tablets"]. Create a mobile-only segment [for GA, include Mobile (including tablet) and exclude Tablet = yes]. That gives 3 segments: Desktop, Tablet and Mobile only. Now start segmenting like this:
       Browser                   Segment          Conv rate
       Safari                    Mobile Traffic   0.79%
       Internet Explorer         Desktop only     1.34%
       Chrome                    Desktop only     1.01%
       Safari                    Tablet Traffic   1.00%
       Safari                    Desktop only     1.28%
       Firefox                   Desktop only     1.20%
       Android Browser           Mobile Traffic   0.31%
       Safari (in-app)           Mobile Traffic   0.69%
       Chrome                    Mobile Traffic   0.62%
       Safari (in-app)           Tablet Traffic   0.89%
       Chrome                    Tablet Traffic   0.84%
       Opera                     Desktop only     0.10%
       Android Browser           Tablet Traffic   0.71%
       Mozilla Compatible Agent  Mobile Traffic   0.51%
       Quote the opportunity! IE8 = 1.41% of revenue; IE8 converts at 20% of IE9 and IE10. Some nasty bugs cause the problem. Worth fixing? That 1.41% problem is worth nearly 6% more checkouts!
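The 'quote the opportunity' arithmetic on the browser-money slide is just visits times the conversion-rate gap. A rough sketch with illustrative figures (the function name and numbers are assumptions, not the slide's actual data):

```javascript
// Estimate extra checkouts from lifting an underperforming browser
// segment up to a target conversion rate.
function extraCheckouts(visits, currentRate, targetRate) {
  return Math.round(visits * (targetRate - currentRate));
}

// e.g. an IE8-style segment of 10,000 visits converting at 0.25%
// vs a 1.25% site-wide rate:
var uplift = extraCheckouts(10000, 0.0025, 0.0125);
```

Multiply the uplift by average order value and the bug-fixing business case writes itself.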
    43. #2: Browser money – you should see something like this:
    44. #3: GA Keyboard Shortcuts (thanks to Farid Alhadi and @fastbloke)
       d t – set date range to TODAY
       d y – set date range to YESTERDAY
       d w – set date range to LAST WEEK
       d m – set date range to LAST MONTH
       d c – toggle date comparison mode (to the previous period of whatever you are looking at; for example, if you're looking at 6 days, this compares it to the 6 days before it)
       d x – toggle date comparison mode (to the previous year of the period you are looking at)
       ? – open keyboard shortcut help
       h – search the help centre
       a – open the account panel
       shift + a – go to account lists / search reports
       shift + d – go to the default dashboard of the current profile
    45. #4: Ranking data – someone searches on Google for a term and clicks on a link to your site. What was the keyword rank for that term? Reverse engineer the actual rankings from users' machines – more accurate than some SEO tools (IMHO). Two articles to show you how: http://bit.ly/Vaisno and http://bit.ly/13lmYF2
    46. #5: Content Engagement – get a better bounce rate metric, see more detailed engagement metrics, measure scrolling and reading activity. Came from @fastbloke but originally Justin Cutroni. Measure your scrolling and exit points from long-form content. Very nice technique – read more at: http://bit.ly/13lmYF2 and http://goo.gl/1AZZb
    47. #6: In-Page Analytics – Enhanced. Correct link attribution for In-Page Analytics – very nice:
       var _gaq = _gaq || [];
       var pluginUrl = '//www.google-analytics.com/plugins/ga/inpage_linkid.js';
       _gaq.push(['_require', 'inpage_linkid', pluginUrl]);
       _gaq.push(['_setAccount', 'UA-XXXXXX-Y']);
       _gaq.push(['_trackPageview']);
    48. #7: Duplicate transactions in GA (thanks to Matt Clarke and @timlb) – stop skewing the data with duplicates/reloads. Custom report to check if you're affected: http://techpad.co.uk/content.php?sid=247. I've seen this in a few places, so it's worth checking, particularly if figures don't tally! Read more at: http://bit.ly/13lmYF2
    49. #8: Event tracking (thanks to #Measurecamp – check the stream). A beginner's guide: http://bit.ly/13RFoJs. Some great ideas here: http://bit.ly/UCcptx. Don't go for 'Event Blizzard' – focus on specific areas where insight is needed. Choose your naming structure carefully: http://bit.ly/WJ4R4c. Read this complete guide: http://bit.ly/VmFSJ4
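"Choose your naming structure carefully" is easiest to enforce with a tiny wrapper around classic GA's `_trackEvent` (category, action, label). The wrapper and the lowercase-hyphenated convention here are one illustrative approach, not a standard:

```javascript
// Push a consistently named event onto a _gaq-style queue.
// Normalising names keeps the Events reports from fragmenting into
// 'Video', 'video' and 'Video ' variants.
function trackEvent(queue, category, action, label) {
  queue.push(['_trackEvent',
    category.toLowerCase(),
    action.toLowerCase().replace(/\s+/g, '-'),
    label]);
  return queue;
}

var _gaq = [];
trackEvent(_gaq, 'Video', 'Play Clicked', '/product-tour');
// _gaq[0] -> ['_trackEvent', 'video', 'play-clicked', '/product-tour']
```

A few lines like this, agreed up front, is the cheapest insurance against the 'Event Blizzard' the slide warns about.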
    50. #9: Google API and more – use the Google API to get super custom reports. You can fetch different data types (on the fly as well as pre-calculated). Automate a HUGE CHUNK of Excel work. @timlb recorded the #MeasureCamp session: www.youtube.com/watch?v=JWXg1_4quwU. Roundup: www.measurecamp.org/aftermath/
    51. #10: Microdata / Rich Snippets – this is funnel step ZERO – work it!
       http://www.seoskeptic.com/beyond-rich-snippets-semantic-web-technologies-for-better-seo/
       http://www.slideshare.net/ismepete/maximising-your-serp-potential-enhance-your-listings-with-rich-snippets
    52. #10: Microdata – SERPs UX. Reviews (huge increases in CTR and conversion), people (authors), products, businesses and organisations, recipes, events, music, local, video. Helps you to dominate the page, push other stuff down and make your listing more persuasive. The conversion journey starts here!
    53. #11: Measure viewport size (thanks to @Beantin and others!) – measure the viewport size, not the resolution. Why? Toolbars, chrome and setup vary – the UK 2011 figure was 2.2 toolbars. Code example here: http://bit.ly/4xaNYK. A common conversion issue: your desk vs. users' setups = different. Turn off the wifi, reduce the viewport – the budget restroom solution.
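The viewport measurement above can be sketched in a few lines. This version takes a window-like object as a parameter so it can be unit-tested outside a browser; the property fallbacks cover older browsers, and recording it as a GA event is shown only as a commented-out suggestion:

```javascript
// Return the visible viewport (not the screen resolution).
// innerWidth/innerHeight are standard; documentElement.clientWidth is
// the classic fallback for old IE.
function viewportSize(win) {
  var doc = win.document && win.document.documentElement;
  return {
    width:  win.innerWidth  || (doc && doc.clientWidth)  || 0,
    height: win.innerHeight || (doc && doc.clientHeight) || 0
  };
}

// In a page you might then record it, e.g.:
// var vp = viewportSize(window);
// _gaq.push(['_trackEvent', 'environment', 'viewport',
//            vp.width + 'x' + vp.height]);
```

Segmenting conversion by recorded viewport buckets is what turns this from trivia into the "common conversion issue" the slide describes.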
    54. Best Practice? There is no such thing as a 'readily repeatable best practice' in conversion optimisation – see the button colour example. There are patterns! But the context varies, so the answer is always "it depends" ;-). It starts with your customers, your site, your data, your insights – not an article online! It starts and ends with customer knowledge – that's best practice!
    55. Top Conversion Questions – 32 questions, picked by practitioners, being recorded on ScreenR.com. What top stuff did I hear this week? "How long will my test take?" "When should I check the results?" "How do I know if it's ready?"
    56. #1: How long will a test take?
       The minimum length: 2 business cycles. Always test whole, not partial, cycles – usually a week, 2 weeks or a month. Be aware of multiple cycles.
       How long after that: IMHO you'll need a minimum of 250 outcomes, ideally 350, for each 'creative'. If you test 4 recipes, that's 1400 outcomes. Make a note of your minimum 'length' for 350 outcomes. If you segment, you'll need more data. It may take longer than that if the response rates are similar*. Work out how long it might take (or how long you can afford it to take): http://visualwebsiteoptimizer.com/ab-split-test-duration/
       * Stats geeks know I'm glossing over something here. Test time depends on how the two experiments separate in terms of relative performance, as well as how volatile the test response is. I'll talk about this when I record this one! This is why testing similar stuff sux.
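The outcomes arithmetic above (recipes × outcomes per recipe ÷ daily conversions) makes a handy back-of-envelope duration check before you commit to a test. A sketch with illustrative inputs – it deliberately ignores the statistical caveats in the slide's footnote:

```javascript
// Rough minimum test duration from the '250-350 outcomes per
// creative' rule of thumb. Not a power calculation.
function estimatedTestDays(recipes, outcomesPerRecipe, dailyConversions) {
  var needed = recipes * outcomesPerRecipe;
  return Math.ceil(needed / dailyConversions);
}

// 4 recipes * 350 outcomes = 1400 total; at 50 conversions/day:
var days = estimatedTestDays(4, 350, 50);
// ...then round up to whole business cycles (e.g. whole weeks).
```

If the answer comes out at months rather than weeks, test fewer recipes or bolder changes.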
    57. #2: Are we there yet? Early test stages… Ignore the graphs. Don't draw conclusions. Don't dance. Calm down. Get a feel for the test but don't do anything yet! Remember – in A/B, 50% of returning visitors will see a shiny new website! Until your test has had at least 1 business cycle and 250-350 outcomes, don't bother drawing conclusions or getting excited! You're looking for anything that looks really odd – your analytics person should be checking all the figures until you're satisfied. All tests move around or show big swings early in the testing cycle. Here is a very high traffic site – it still takes 10 days to start settling. Lower traffic sites will stretch this period further.
    58. #3: What happens when a test flips on me? Something like this can happen. Check your sample size – if it's still small, then expect this until the test settles. If the test does genuinely flip, and quite severely, then something has changed with the traffic mix, the customer base or your advertising. Maybe the PPC budget ran out? Seriously! To analyse a flipped test, you'll need to check your segmented data – this is why you have a split testing package AND an analytics system. The segmented data will help you identify the source of the shift in response to your test. I rarely get a flipped one, and it's always something changing on me without my being told. The heartless bastards.
    59. #4: What happens if a test is still moving around? There are three reasons: your sample size (outcomes) is still too small; the external traffic mix, customers or reaction has suddenly changed; or your inbound-marketing-driven traffic mix is completely volatile (very rare). Check the sample size, check all your marketing activity, check the instrumentation and, if there's still no reason, check segmentation.
    60. #5: How do I know when it's ready? The hallmarks of a cooked test:
       – It's done at least 1 or 2 (preferred) cycles
       – You have at least 250-350 outcomes for each recipe
       – It's not moving around hugely at creative or segment level
       – The test results are clear – even if the precise values are not
       – The intervals are not overlapping (much)
       – If a test is still moving around, you need to investigate
       – Always declare on a business cycle boundary – not the middle of a period (this introduces bias)
       – Don't declare in the middle of a limited-time advertising campaign (e.g. TV, print, online)
       – Always test before and after large marketing campaigns (one week on, one week off)
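The 'intervals are not overlapping' hallmark can be sanity-checked with a simple two-proportion z-score. This is one hedged way to do it, with illustrative numbers – your testing tool's own statistics should remain the authority:

```javascript
// Two-proportion z-score: have recipes A and B genuinely separated?
// convA/convB = conversions, nA/nB = visitors per recipe.
function zScore(convA, nA, convB, nB) {
  var pA = convA / nA, pB = convB / nB;
  var pooled = (convA + convB) / (nA + nB);
  var se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se;
}

// |z| > 1.96 roughly corresponds to 95% confidence:
var z = zScore(300, 10000, 380, 10000);
var separated = Math.abs(z) > 1.96;
```

Run it at your declared cycle boundary, not whenever the graph looks exciting – the earlier slides explain why peeking early misleads.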
    61. #6: What happens if it's inconclusive? Analyse the segmentation. One or more segments may be over and under – they may be cancelling out, so the average is a lie. The segment-level performance will help you (beware of small sample sizes). If you genuinely have a test which failed to move any segments, it's a crap test. This usually happens when it isn't bold or brave enough in shifting away from the original design, particularly on lower traffic sites. Get testing again!
    62. #7: What QA testing should I do? Cross-browser testing; testing from several locations (office, home, elsewhere); testing that the IP filtering is set up; that test tags are firing correctly (analytics and the test tool); testing as a repeat visitor and checking session timeouts; cross-checking figures from 2+ sources; monitoring closely from launch, then rechecking.
    63. #8: What happens if it fails? Learn from the failure. If you can't learn from the failure, you've designed a crap test. Next time you design, imagine all your stuff failing. What would you do? If you don't know or you're not sure, get it changed so that a negative becomes useful. So: failure itself, at a creative or variable level, should tell you something. On a failed test, always analyse the segmentation – one or more segments will be over and under, so check for varied performance. Now add the failure info to your knowledge base. Look at it carefully – what does the failure tell you? Which element do you think drove the failure? If you know what failed (e.g. making the price bigger) then you have very useful information: you turned the handle the wrong way. Now brainstorm a new test.
    64. #9: Should I run an A/A test first? No – and this is why: it's a waste of time; it's easier to test and monitor instead; you are eating into test time. The same applies to A/A/B/B testing. A/B/A running at 25%/50%/25% is the best. Read my post here: http://bit.ly/WcI9EZ
    65. #10: What is a good conversion rate? Higher than the one you had last month!
    66. Email: sullivac@gmail.com | Twitter: @OptimiseOrDie | LinkedIn: linkd.in/pvrg14 | Slideshare: slidesha.re/nlCDm6. More reading, slides and resources on slideshare.net
    67. RESOURCE PACK
    68. CRO and Testing resources: 101 Landing page tips slidesha.re/8OnBRh, 544 Optimisation tips bit.ly/8mkWOB, 108 Optimisation tips bit.ly/3Z6GrP, 32 CRO tips bit.ly/4BZjcW, 57 CRO books bit.ly/dDjDRJ, CRO article list bit.ly/nEUgui, Smashing Mag article bit.ly/8X2fLk
    69. [Optimisation maturity model diagram – five levels across the dimensions of testing focus, culture, process, analytics focus, insight methods and mission:
       Level 1 (Starter level) – ad hoc, guessing; A/B testing, basic tools, analytics, surveys, contact centre, low-budget usability; outline process, small team, low-hanging fruit; mission: prove ROI.
       Level 2 (Early maturity) – local heroes; + multivariate, session replay, no segments; regular usability testing/research, prototyping, onsite feedback; mission: scale the testing.
       Level 3 (Serious testing) – chaotic good; dedicated team, volume opportunities, systematic tests; + funnel optimisation, call tracking, some segments, micro testing, bounce rates, big-volume landing pages; funnel analysis, low-converting and high-loss pages, user-centred design, layered feedback, mini product tests, get buy-in; mission: mine value.
       Level 4 (Core business value) – cross-silo team, well developed, streamlined; + offline integration, single-channel picture, funnel fixes, forms analytics, channel switches, cross-channel testing, integrated CRO and analytics, segmentation; customer sat scores tied to UX, rapid iterative testing and design; mission: continual improvement.
       Level 5 (You rock, awesomely) – ninja team, testing in the DNA, company-wide; + spread tool use, dynamic adaptive targeting, machine learning, realtime, multichannel funnels, cross-channel synergy; all-channel view of the customer, driving offline using online, all promotion driven by testing.]
    70. HOMEWORK 1 – I'd like you to look at how unconscious action is part of your life every week. Several times a day, you'll use a door. There are many different interfaces for doors: handles, knobs, buttons, push plates, levers and more. We all go through every day using these things and don't consciously think about what we're doing. You'll use them at work, at home, when you travel, shop or use the loo! There are several things you'll spot if you keep a door diary for a few days. Let's give you 3 weeks to finish, to give you time to fit this in. Here is your work:
       1. Explore. See how many different types of door interface you can spot. Take pictures of them and add notes on your phone or using an app, or take notes with a notepad and pencil. Photographs are especially useful for showing examples.
       2. Patterns and groups. Do these door interfaces have a pattern? Do they fit into groups? What would you call these groups?
    71. HOMEWORK 2
       3. The furnishings. Take a look at the door furniture and signs: is there any other stuff apart from the handle that you look at? What signs or stuff are plastered on the door? Are there any messages telling you stuff?
       4. Error with door. What happens when it goes wrong? This is really hard to catch, but if you keep trying for a week you'll spot a few. What happens when the door interface goes wrong? You get it the wrong way, curse to yourself and then do something different. What do you notice about when this happens?
       5. What caused it? What was it, when it all went wrong, that led you to 'get the door wrong' and have to try again? What went wrong that didn't happen with all the other doors? If you watch the door, does it happen to other people?
       6. Summary. Keep a log if you can (scribbled notes, mobile phone app, photos) and look at the five things I've listed. There might be more stuff than I'm hinting at, so observe closely. How many different 'kinds' of door interface can you spot? Are there groups of them – similar kinds? What would you call these? Catch yourself when it goes wrong. Watch other people when it goes wrong. Why did it go wrong?
    72. Collecting the evidence
       Apps:
       https://itunes.apple.com/au/app/this-is-note-calendar-+-photoalbums/id403746123?mt=8
       https://itunes.apple.com/us/app/awesome-note-+to-do-calendar/id320203391?mt=8
       http://www.blurb.com/mobile
       Inbox or stream based:
       http://www.memonic.com/tour#web-clipper
       https://launch.unifiedinbox.com/
    73. END SLIDES
    74. BONUS DECK – hope you find this useful. A small bonus here, with some slides about conversion optimisation methodologies and how you should try to structure your approach.
    75. #2 CRO Project Styles
    76. What's the problem?
       #1 User experience and conversion optimisation are not a checkbox or a step in the process – they need to be integrated into everything you do.
       #2 This work isn't a one-off exercise either – it's an ongoing continuous improvement process, like Kaizen.
       #3 Usability testing isn't enough – other UX factors, like the visceral and behavioural emotional responses we have to products, need tuning too.
       #4 It's not just about the user!
       #5 It's usually Self Centred Design, driven by Ego, Opinion and Assumption.
    78. Also, the dial won't turn anymore (with thanks to @morys) – [chart: PPC and SEO returns flattening]
    79. 79. Why is this happening?
• PPC changes
• Advertising models flattening – i.e. mobile Google costs
• Comprehensive SEO changes
• Competition increasing
• Fleetness of foot – Asos in Australia
• Entry costs are lower now
• New entrants compete without the cruft
• Startups are using better ‘build and optimise’ methodologies than nearly all corporates
• The old way of doing things is going to die
• The new way of doing things is your only survival ticket
    80. 80. So what do people do?
• They throw tools at the problem
• They try usability testing and research
• They generate more data to look at
• They make changes without measuring or testing
• They hope to randomly create the optimal system
• They get an expensive agency to help them
• They push their team harder, like galley slaves
• They experiment with riskier advertising models
• They wonder why they’re burning rubber
• They try more random things
• Then they call a CRO person and say “We’ve tried everything. It isn’t working. Help!”
    81. 81. Skinner’s Pigeon Experiment
• Participants invited into a room with objects
• Told to score 100 points within 30 minutes
• Participants moved objects around, made noise, jumped around, tried anything to make a counter increase the points score.
• They got horribly confused
• They then created convincing lies for themselves, to explain what they thought was working. They became superstitious and made rituals.
• The points allocation was made randomly by a goldfish, swimming back and forward in a tank.
• Know any marketing departments like this?
• I’ve seen this a lot – and it paralyses companies
• We need a better way. A methodology?
    83. 83. Lean UX
Positive
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas
Negative
– Often needs user test feedback to steer the development, as data is not enough
– Bosses distrust stuff where the outcome isn’t known
“The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles.”
    84. 84. Agile UX / UCD / Collaborative Design
Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when using Agile iterations)
Negative
– Without quant data, user goals can drive the show – missing the business sweet spot
– Some people find it hard to integrate with siloed teams
– Doesn’t work with waterfall IMHO
[Cycle diagram: Wireframe → Prototype → Test → Analyse → Concept → Research]
“An integration of User Experience Design and Agile* Software Development Methodologies” (*Sometimes)
    85. 85. CRO
    86. 86. Lean Conversion Optimisation
Positive
– A blend of several techniques
– Multiple sources of Qual and Quant data aid triangulation
– A CRO analytics focus drives unearned value inside all products
Negative
– Needs a one-team approach with a strong PM who is a polymath (Commercial, Analytics, UX, Technical)
– Only works if your teams can take the pace – you might be surprised though!
“A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation.”
    87. 87. Lean CRO
[Cycle diagram: Inspection → Immersion → Identify → Triage & Triangulate → Outcome Streams → Measure → Learn → Instrument]
    88. 88. Triage and Triangulation
• Starts with the analytics data
• Then UX and user journey walkthrough from SERPS -> key paths
• Then back to analytics data for a whole range of reports: segmented reporting, traffic sources, device viewport and browser, platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME
“This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift.”
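The uplift-versus-difficulty quadrant described above can be sketched as a tiny scoring routine. This is a minimal illustration only – the field names, the 1–5 scales and the subtraction-based score are my assumptions, not part of the deck:

```python
# Sketch of the triage quadrant: score each opportunity by estimated
# uplift vs. difficulty (time/resource/complexity/risk), then work the
# high-uplift, low-difficulty cluster first.

def triage(opportunities):
    """Sort opportunities so high-uplift, low-difficulty items come first."""
    def score(opp):
        # Higher uplift is better; higher difficulty is worse.
        return opp["uplift"] - opp["difficulty"]
    return sorted(opportunities, key=score, reverse=True)

backlog = [
    {"name": "fix mobile checkout button", "uplift": 5, "difficulty": 1},
    {"name": "redesign homepage",          "uplift": 4, "difficulty": 5},
    {"name": "clarify shipping copy",      "uplift": 3, "difficulty": 1},
]

for opp in triage(backlog):
    print(opp["name"])
```

In practice you would plot these on the quadrant rather than reduce them to one number, but the ordering idea is the same: easiest stuff with the largest uplift rises to the top.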
    89. 89. The Bucket Methodology
“Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost.”
• Test – If there is an obvious opportunity to shift behaviour, expose insight or increase conversion, this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.
• Instrument – If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling on the analytics configuration. We instrument both structurally and for insight in the pain points we’ve found.
• Hypothesise – This is where we’ve found a page, widget or process that’s just not working well but we don’t see a clear single solution. Since we need to really shift the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by evidence and data, we’ll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.
• Just Do It – JFDI (Just Do It) is a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Stuff in here requires low effort or is a micro-opportunity to increase conversion, and should be fixed.
• Investigate – You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
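The five buckets above can be sketched as a routing function that forces every issue into exactly one action stream. The routing rules and issue attributes here are illustrative assumptions for demonstration, not rules from the deck:

```python
# Illustrative sketch of the bucket methodology: every issue found during
# triage must land in exactly one of the five action buckets.

BUCKETS = ("Test", "Instrument", "Hypothesise", "Just Do It", "Investigate")

def route(issue):
    """Assign an issue (a dict of flags) to one of the five buckets."""
    if issue.get("needs_more_data"):
        # Not enough information to triangulate yet: dig further.
        return "Investigate"
    if issue.get("analytics_gap"):
        # Reporting needs beefing up before we can act on this page.
        return "Instrument"
    if issue.get("obvious_fix"):
        # No-brainer change: deploy in a batch or as a controlled test.
        return "Just Do It"
    if issue.get("clear_test_candidate"):
        # Traffic plus leakage with an obvious opportunity: split test it.
        return "Test"
    # No clear single solution: brainstorm evidence-driven hypotheses.
    return "Hypothesise"

issue = {"page": "/checkout", "analytics_gap": True}
print(route(issue))  # → Instrument
```

The useful property is that the function always returns something: no issue gets parked without an action, which is the point of the methodology.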
    90. 90. How is it working out?
• Methodologies are not Real Life™
• It’s mainly about the mindset of the team and managers, not the tools or methodologies they play with
• Not all my clients have all the working parts
• You should not be a methodology slave
• Feel free to make your own or flexibly adapt
• Use some, any, techniques instead of ‘guessing’
• Blending lean and agile with conversion optimisation outcomes is my critical learning of the last 5 years
• Doing rapid cycles of this outcome-driven work for Belron – World Conversion Rate Increase: 2009 +5%, 2010 +10%, 2011 +15%, 2012 +25%
• If you’d like to develop a good one for your company, talk to me first!
• Don’t over complicate it.