A roundup of all the things to help you maintain a competitive edge in experience design and conversion optimisation. With examples of companies putting this stuff together, the tools they are using and their project management approaches, this presentation delves deeper into the cultural aspects of CRO.
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015 - Craig Sullivan
My Slides from Reaktor Breakpoint 2015 - This is by far the best deck (and hopefully talk) I've done this year. Masses of info, reading, articles, useful reports and more.
Surviving the hype cycle: Shortcuts to split testing success - Craig Sullivan
In this talk, I show the key shortcuts to stop doing stupid testing and move towards innovative and transformative design & build methodologies, including innovation through split testing exploration.
Web Analytics Wednesday - Session Replay Tools are Vital - Craig Sullivan
Session Replay or Screen Recording tools are now part of an arsenal of discovery toolkits that can drive optimisation, bug fixes, funnel and journey analysis - using qual and quant techniques. Without these tools, the analytics data misses emotion, frustration, friction and more - I've collated the best tips, tricks, tools and approaches to yield the most valuable insights for CRO / Growth Hacking.
Cross Device Optimisation - Google Analytics Shortcuts - Craig Sullivan
In this session, we explain how to mine GA for broken device experiences, flows, funnel blocks and more... Using a new grid tool we've developed, you can pull multi-dimensional segmented funnel and metric data from Google Analytics - we explain how it works, why you need it and what problems it solves. Find where your site is leaking money through data
Slides to go with a talk on rapid, lightweight research you can do before tackling a landing page, funnel step or lead-gen form. Comes complete with all the Google Analytics reports you'll need to mine useful data to share! Less bullshit, more truth in meetings!
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014 - Craig Sullivan
A practical toolkit for getting inside customers' heads, in order to design and create persuasive psychological approaches to copy, pages, buttons, designs and your entire service. Craig shows you here how to mine what you already have - to design a better bank balance and a continuously improving future for your company, staff and your customers.
Myths, Lies and Illusions of AB and Split Testing - Craig Sullivan
What are the common assumptions about AB (split) testing that are wrong? What lies are told by vendors and consultants - and what have you convinced yourself of? What is illusory, what can you trust, and what is it really all about? 20 top myths debunked after asking fellow CRO professionals what is on THEIR top list.
Mobile presentation - Sydney Online Retailer - 26 Sep 2011 - Craig Sullivan
In this presentation, I use analytics data from our global mobile reach, to illustrate the trends that are driving growth, how to take opportunity from them and what to do with your own site. I present a case for device and user knowledge, to allow you to optimise conversion rates, revenue and delight for visitors.
Condensed testing syrup - @OptimiseorDie @sydney sep 2011 - 4 years of testin... - Craig Sullivan
A summary of my 4 years of A/B and Split testing, with case studies of work, photography guidelines, and key advice on which elements of the page to test for quick wins.
I enjoyed giving this talk at the Online Retailer Conference in Sydney, which is a fine place to visit.
The presentation covers a really good 'pizza' analogy for explaining testing to senior management and budget holders, then covers how you go about discovering good places to test on your site, and what tools will help you get that data.
Lastly, I explore what worked for me in testing, show some examples of how similar our winners are across the globe and then cover some cross channel testing. The last one here is a big growth area and involves optimising contact centres and channels, using the web as a tool. Some interesting work going on here and I show some of ours, as well as new things in the pipeline.
There are some great resources attached, including a list of remote user testing services and the best 'guides' I could find on 'Conversion Rate Optimisation'. Hope you enjoyed the talk and thank you Sydney.
Myths and Illusions of Cross Device Testing - Elite Camp June 2015 - Craig Sullivan
A compendium of the most common mistakes and problems people encounter when trying to optimise or split test cross device experiences (mobile, tablet, desktop, app, tv etc.)
#Measurecamp : 18 Simple Ways to F*** up Your AB Testing - Craig Sullivan
An expanded deck of the top 18 blockers to getting successful AB or Multivariate test results. In this deck, you get a complete checklist of the stuff you need to prepare, watch, launch and monitor your testing, so it gets you the *right* conclusions.
The top reasons and solutions for not getting value out of your AB tests - some practical tips for designing insightful and correctly instrumented tests.
20 Ways to Shaft your Split Testing : Conversion Conference - Craig Sullivan
This talk is the latest deck showing common problems that will easily break or skew your AB and multivariate testing results. Avoid these problems by following the simple advice in this deck!
Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to Me - Craig Sullivan
An updated deck of a short talk (30m) given at the first Brighton CRO meetup. Contains useful AB testing tools as well as full speaker notes for most of the slides.
Product design is Poo - And how to fix it! - Craig Sullivan
A look at why product design is still so poor, even after 22 years of digital design work. Why do these problems exist and how can we remove them from the way we build products? Lean corporate and startup growth models are explored in the solutions to this horrendous problem!
Product Design is Poo - And we're all going to die - Craig Sullivan
A humorous presentation about what is wrong with the current way of building digital products. What is wrong, the warning signs and a checklist for reforming your company are laid out with links, resources and further reading.
Why Does My Conversion Rate Suck? Craig Sullivan, Senior Optimisation Consult... - PRWD
Craig Sullivan, Senior Optimisation Consultant, covers the top 10 reasons why your conversion rate might suck. Packed with actionable tips and resources, this presentation is for anyone wanting to improve their Conversion Optimisation. Craig covers common problems and topic areas such as issues with Google Analytics setup, inputs, tools, testing, testing cycles, product cycles, Photo UX, how to analyse statistics and data, segmentation, and multiple channel optimisation. The resource pack also includes a maturity model, crowd-sourced UX, collaborative tools, testing tools for CRO & QA, a Belron methodology example, and CRO and testing resources.
Onboard like a juggernaut - Elite Camp 2015 - Conversionista
Conversionista's presentation at Digital Elite Camp in Estonia, 2015.
Find the critical conversion points in your SaaS user onboarding journey.
- Get Registered Prospects to use the service
- Get Active users to pay up
- Get Paying Customers to stay
- Get churned users to return
- Get all of them to refer more users
Do this - And rock!
Craig Sullivan - Keynote speaker summary & final thoughts - Conversion Hotel ... - Webanalisten.nl
Slides of the keynote by Craig Sullivan (UK) at Conversion Hotel 2015, Texel, the Netherlands (#CH2015): "You already listened to 10 keynotes – number 11 will refresh your memory, make you laugh and will leave you with some final thoughts for the trip home." http://conversionhotel.com
SAMPLE SIZE – The indispensable A/B test calculation that you're not making - Zack Notes
If you're a marketer, it's very likely that you've run an A/B test. It's also likely that you've never calculated the sample size for your tests and instead run tests until they reach statistical significance. If this is the case, your strategy is statistically flawed. Respecting the sample size requires marketers to wait longer for test results, but ignoring it will yield false positives and lead to bad decisions.
This deck was created for an email audience, but there are valuable lessons for anyone who runs A/B tests.
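A minimal sketch of the calculation the deck refers to, assuming a classic two-proportion test at 95% confidence and 80% power (the z-values are hardcoded for those settings; the baseline and uplift figures are illustrative):

```typescript
// Per-variant sample size for an A/B test on a conversion rate, using the
// standard two-proportion formula. z-values are hardcoded for a two-sided
// 5% significance level and 80% power; the inputs below are illustrative.
function sampleSizePerVariant(baselineRate: number, relativeUplift: number): number {
  const zAlpha = 1.96; // 95% confidence, two-sided
  const zBeta = 0.84;  // 80% power
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeUplift);
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil(numerator ** 2 / (p2 - p1) ** 2);
}

// e.g. a 3% baseline conversion rate, hoping to detect a 10% relative uplift
console.log(sampleSizePerVariant(0.03, 0.10)); // roughly 53,000 visitors per variant
```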
AB Testing and UX - a love story with numbers and people (by Craig Sullivan a... - Northern User Experience
AB Testing and UX - a love story with numbers and people
Slides from the NUX6 talk by Craig Sullivan, Friday 27th October 2017.
2017.nuxconf.uk / nuxuk.org
Synopsis:
What’s wrong with the web these days? The mobile experience sucks. The customer experience sucks. It doesn’t work. It’s too hard to use. The text is too small. Nobody measures this happening. The interaction patterns suck. Nobody ever calls up to complain but nobody does anything anyway. Millions of people lose countless days to friction, poor design and frustrating moments on their devices.
There may be thousands of things you can fix that look promising – but how do you know where to start? What if you could measure what sucked, where it sucked and how big the problem was? Using lightweight research methods and tools, you can stop making excuses and start knowing exactly what to do. Life becomes much simpler and easier with a scientific method of optimising growth or delight within your product.
Craig has trained over 500 people on how to measure and optimise their product experience, finding 100M of ‘lost revenue’ using just one of the techniques you will learn. With reports, checklists, downloadable templates and toolkits for every budget and stage of growth – you can stop guessing tomorrow.
The tools used by the CRO masters around the world to optimise analytics, UX, VOC, insight and testing - all to optimise your insight or conversion figures.
Data Driven Website Decisions for SEO and CRO - Amanda King
My presentation for Digimarcon Sydney on 29 Aug 2019.
For many, SEO is a black box that doesn't open up easily to share with stakeholders to prioritise and understand the value to the business. This presentation shares tactics on how to equate both overall SEO strategy and individual SEO tactics to business value.
SEO Server Log File Analysis - What You Should Be Looking For - Tea-Time SEO ... - Authoritas
Get practical technical SEO advice from SEO experts: Hosted by Jason Barnard, "The Brand SERPs guy" with three great speakers: Tom Pool, Technical SEO Director at BlueArray; Faisal Anderson, Technical SEO EMEA @ LiveArea and Julien Deneuville, Owner / Freelance consultant @ Databulle.
In this short ~20 minute talk they present bite-sized technical SEO advice covering everything you need to know and more about server log file analysis and crawling for SEO. Their talks are offered free to the SEO community working from home during the coronavirus pandemic.
Watch a recording of the stream to go with these slides here:
https://www.youtube.com/watch?v=Mw3YEYsVQOE
Visit https://www.authoritas.com for more SEO advice and SEO tools and data to help you drive more organic traffic to your ecommerce stores.
You Spoke, We Listened – Achieving a New Level of Search Optimization with Go... - Concept Searching, Inc
Join us to learn how the combination of Google Search Appliance and conceptClassifier improves search outcomes. Concept Searching’s industry recognized technology, still unique in the market, provides the Google Search Appliance index with automatically generated semantic metadata, based on our core compound term processing, that generates multi-term metadata that represents concepts.
If you are using Google Search Appliance or evaluating search options, this webinar will provide highly useful information to understand how conceptClassifier improves search results in any search environment.
How does conceptClassifier improve Google Search Appliance results?
• Eliminates end user tagging.
• Automatically generates keywords and phrases that represent a concept.
• Accepts queries in natural language, with the user typing words, phrases or whole sentences, and inter-relationships among content,
• Retrieves highly relevant content even if the results do not contain the search string,
• Improves dynamic navigation through the use of semantic metadata.
• Improves result biasing as the results will automatically include related keywords and concepts.
• Eliminates the development of collections as the taxonomy hierarchy can be used to guide users to specific content based on their role and function – this also applies to segmentation.
• Can load synonym lists, though they are not needed, as the taxonomy manager component will recognize synonyms as well as related concepts during auto-classification. The results can be easily and quickly modified by an administrator.
• Blacklist and stop words can also be managed in the taxonomy component, without creating and uploading files, eliminating human subjectivity.
• Taxonomies are easily tuned and managed by Subject Matter Experts, providing unique features not available in any other taxonomy product.
Research and Discovery Tools for Experimentation - 17 Apr 2024 - v 2.3 (1).pdf - VWO
You can utilize various forms of Generative Research to deepen your understanding of how people interact with your product or service.
Craig has amassed a vast toolkit of research methods, which he has employed to optimize websites and apps for over 500 companies. He'll share which methods yielded the highest return on investment, identified key customer pain points, and generated the best experiment ideas.
By sharing the top inspection methods essential for our work, Craig will provide advice for each technique. Anticipate insights on driving experiment hypotheses from research, a list of essential toolkit components for tomorrow, and additional resources for further reading.
Search and Social Media Marketing Course Slides - Salford University - Tom Mason
Slides from Delineo Head of SEO Phil Morgan's session at SEO Training: Search and Social Media Marketing Course from Salford University. http://www.searchmarketing.salford.ac.uk/
Discover actionable SEO tips from industry thought leaders at Online SEMrush and the Melbourne SEO Meetup.
The presentation covers the following topics:
1. Copywriting for Search Success (Jim Stewart)
Jim will dive into a real-world case study and share tips on how to rank in Google's Featured Answer.
2. Basic E-Commerce Site Audit (Tim Capper)
A beginner's guide to auditing your e-commerce site. Tim will cover the most common problems encountered within an e-commerce site, and how to check for these problems using both free and paid tools.
3. 10 Techniques to Convert Your Search Visits (Frederic Chanut)
Frederic will share his tips on how to increase profits for an online business by understanding exactly what makes an audience become customers.
4. WordPress SEO Tips and Tricks (Chris Burgess)
Chris will share the essential steps to creating and optimizing the perfect WordPress website, popular SEO plugins to use, and common mistakes marketers make when working with WordPress.
You can find more tips here: https://www.semrush.com/webinars/online-seo-meetup/
12. #1 : Fix your broken Analytics!
@OptimiseOrDie
13. #1 : Fix your broken Analytics!
@OptimiseOrDie
14. #1 : Common problems (GA)
• Dual purpose goal page
– One page used by two outcomes – and not split
• Cross domain tracking
– Where you jump between sites, this borks the data
• Filters not correctly set up
– Your office, agencies, developers are skewing data
• Code missing or double code
– Causes visit splitting, double pageviews, skews bounce rate
• Campaign, Social, Email tracking etc.
– External links you generate are not setup to record properly
• Errors not tracked (404, 5xx, Other)
– You are unaware of error volumes, locations and impact
• Dual flow funnels
– Flows join in the middle of a funnel or loop internally
• Event tracking skews bounce rate
– If an event is set to be 'interactive' – it can skew bounce rate (example; see the sketch below)
@OptimiseOrDie
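Two of the problems above lend themselves to a concrete illustration. This is a minimal sketch assuming Universal Analytics (analytics.js), the version current when this deck was written; the event and campaign names are illustrative:

```typescript
// Assumes the Universal Analytics (analytics.js) snippet is already on the page,
// so the global ga() command queue exists. Names below are illustrative.
declare function ga(...args: unknown[]): void;

// An event that fires on most pages (e.g. a scroll tracker) will flatten your
// bounce rate unless it is flagged as non-interactive:
ga('send', 'event', 'engagement', 'scroll-depth', '25%', { nonInteraction: true });

// Links you generate yourself (email, social, display) need campaign parameters,
// otherwise those visits land as direct/referral and the channel data is skewed:
const landingUrl =
  'https://www.example.com/offer' +
  '?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale';
console.log(landingUrl);
```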
15. #1 : Solutions
• Get a Health Check for your Analytics
– Try @prwd, @danbarker, @peter_oneill or ask me!
• Invest continually in instrumentation
– Aim for at least 5% of dev time to fix + improve
• Stop shrugging : plug your insight gaps
– Change 'I don't know' to 'I'll find out'
• Look at event tracking (Google Analytics)
– If set up correctly, you get wonderful insights
• Would you use paper instead of a till?
– You wouldn't do it in retail so stop doing it online!
• How do you win F1 races?
– With the wrong performance data, you won't
@OptimiseOrDie
16. #2 : Get the right inputs
Insight - Inputs: Opinion, Cherished notions, Marketing whims, Cosmic rays, Not 'on brand' enough, Ego, IT inflexibility, Panic, Internal company needs, #FAIL, Competitor change, An article the CEO read, Some dumbass consultant, Competitor copying, Dice rolling, Guessing, Knee jerk reactions, Shiny feature blindness
@OptimiseOrDie
17. #2 : Get the right inputs
Insight - Inputs: Usability testing, Forms analytics, Search analytics, Voice of Customer, Market research, Eye tracking, Customer contact, A/B and MVT testing, Big & unstructured data, Social analytics, Session Replay, Web analytics, Segmentation, Sales and Call Centre, Surveys, Customer services, Competitor evals
@OptimiseOrDie
18. #2 : Solutions
• Usability testing and User Centred design
– If you're not doing this properly, you're hosed
• Champion UX+ - with added numbers
– (Re)designing without inputs + numbers is guessing
• You need one team on this, not silos
– Stop handing round the baby (I'll come back to this)
• Ego, Opinion, Cherished notions – fill gaps
– Fill these vacuums with insights and data
• Champion the users
– Someone needs to take their side!
• You need multiple tool inputs
– Let me show you my shortlist…
@OptimiseOrDie
19. #3 : Get the right tools
Insight - Inputs
@OptimiseOrDie
20. 3.1 - Session Replay
• Vital for optimisers & fills in a 'missing link' for insight
• Rich source of data on visitor experiences
• Segment by browser, visitor type, behaviour, errors
• Forms Analytics (when instrumented) are awesome
• Can be used to optimise in real time!
Session replay tools
• Clicktale (Client) - www.clicktale.com
• SessionCam (Client) - www.sessioncam.com
• Mouseflow (Client) - www.mouseflow.com
• Ghostrec (Client) - www.ghostrec.com
• Usabilla (Client) - www.usabilla.com
• Tealeaf (Hybrid) - www.tealeaf.com
• UserReplay (Server) - www.userreplay.com
@OptimiseOrDie
23. 3.2 - Feedback / VOC tools
• Anything that allows immediate realtime onpage feedback
• Comments on elements, pages and overall site & service
• Can be used for behavioural triggered feedback
• Tip! : Take the Call Centre for beers
• Kampyle - www.kampyle.com
• Qualaroo - www.qualaroo.com
• 4Q - 4q.iperceptions.com
• Usabilla - www.usabilla.com
24. 3.3 - Survey Tools
• Surveymonkey - www.surveymonkey.com (1/5)
• Zoomerang - www.zoomerang.com (3/5)
• SurveyGizmo - www.surveygizmo.com (5/5)
• For surveys, web forms, checkouts, lead gen – anything with form filling – you have to read these two: Caroline Jarrett (@cjforms) and Luke Wroblewski (@lukew)
• With their work and copywriting from @stickycontent, I managed to get a survey with a 35% clickthrough from email and a whopping 94% form completion rate.
• Their awesome insights are the killer app I have when optimising forms and funnel processes for clients.
@OptimiseOrDie
25. 3.4 - UX Crowd tools
Remote UX tools (P=Panel, S=Site recruited, B=Both)
• Usertesting (B) - www.usertesting.com
• Userlytics (B) - www.userlytics.com
• Userzoom (S) - www.userzoom.com
• Intuition HQ (S) - www.intuitionhq.com
• Mechanical turk (S) - www.mechanicalturk.com
• Loop11 (S) - www.loop11.com
• Open Hallway (S) - www.openhallway.com
• What Users Do (P) - www.whatusersdo.com
• Feedback army (P) - www.feedbackarmy.com
• User feel (P) - www.userfeel.com
• Ethnio (For Recruiting) - www.ethnio.com
Feedback on Prototypes / Mockups
• Pidoco - www.pidoco.com
• Verify from Zurb - www.verifyapp.com
• Five second test - www.fivesecondtest.com
• Conceptshare - www.conceptshare.com
• Usabilla - www.usabilla.com
37. 3.7 – Split test funnies
• Many tests fail due to QA or browser bugs
– Always do cross browser QA testing – see resources
• Don’t rely on developers saying ‘yes’
– Use your analytics to define the list to test
• Cross instrument your analytics
– You need this to check the test software works
• Store the variant(s) seen in analytics (see the sketch below)
– Compare people who saw A/B/A vs. A/B/B
• Segment your data to find variances
– Failed tests usually show differences for segments
• Watch the test and analytics CLOSELY
– After you go live, religiously check both
– Read this article : stanford.io/15UYov0
@OptimiseOrDie
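A minimal sketch of that cross-instrumentation step: write the variant each visitor saw into the analytics tool so the test platform's numbers can be checked and segmented independently. It assumes Universal Analytics (analytics.js); the custom dimension index and experiment names are illustrative:

```typescript
// Record which test variant the visitor saw as a GA custom dimension, so analytics
// data can be compared against the testing tool and segmented by variant later.
// Assumes analytics.js; the dimension index (1) and experiment/variant names are illustrative.
declare function ga(...args: unknown[]): void;

function recordVariant(experimentId: string, variant: 'A' | 'B'): void {
  ga('set', 'dimension1', `${experimentId}:${variant}`); // attached to subsequent hits
  ga('send', 'event', 'experiments', experimentId, variant, { nonInteraction: true });
}

recordVariant('checkout-redesign', 'B');
```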
38. #3 : Summary – the minimum!
• Session replay tools
– Clicktale, Tealeaf, Sessioncam and more…
• Cheap / Crowdsourced usability testing
– See the list in this deck
• Voice of Customer / Feedback tools
– 4Q, Kampyle, Qualaroo, Usabilla and more…
• A/B and Multivariate testing
– Optimizely, Google Content Experiments, VWO
• Email, Browser and Mobile Device Testing
– You don't know if it works unless you check
@OptimiseOrDie
40. Methodologies - Lean UX
"The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles."
Positive
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas
Negative
– Often needs user test feedback to steer the development, as data not enough
– Bosses distrust stuff where the outcome isn't known
@OptimiseOrDie
41. Agile UX / UCD / Collaborative Design
"An integration of User Experience Design and Agile* Software Development Methodologies"
*Sometimes
(Cycle diagram: Research, Concept, Wireframe, Prototype, Test, Analyse)
Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when using Agile iterations)
Negative
– Without quant data, user goals can drive the show – missing the business sweet spot
– Some people find it hard to integrate with siloed teams
– Doesn't work with waterfall IMHO
@OptimiseOrDie
43. Lean Conversion Optimisation
"A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation."
Positive
– A blend of several techniques
– Multiple sources of Qual and Quant data aids triangulation
– CRO analytics focus drives unearned value inside all products
Negative
– Needs a one team approach with a strong PM who is a Polymath (Commercial, Analytics, UX, Technical)
– Only works if your teams can take the pace – you might be surprised though!
@OptimiseOrDie
45. #4 : Solutions – Agile
• Design your own methodology
– Experiment and optimise with your team
• Don't be a slave
– The methodology is the slave, not your master
• Ask me later…
– Questions – see me on Twitter, G+ or ask by mail
• Collaborative working
– Harvard study into teams – it's an 'all the time' thing
@OptimiseOrDie
46. #5 : Build a volume testing culture
@OptimiseOrDie
47. #5 : Common issues
• How many tests do you complete a month?
– Without volume, you won't get as much attention or traction
• Vanity testing takes hold
– Getting one test done a quarter? Still showing it a year later?
• Not enough resource
– You MUST hire, invest and ringfence time and staff for CRO
• Testing has gone to sleep
– Some vendors have a 'rescue' team for these accounts
• You keep testing without buyin at C-Level
– If nobody sees the flower, was it there?
• You haven't got a process – just a plugin
– Insight, Brainstorm, Wireframe, Design, Build, QA test, Monitor, Analyse. Tools, Process, People, Time -> INVEST
• IT or release barriers slow down work
– Circumvent with tagging tools
– Develop ways around the innovation barrier
@OptimiseOrDie
48. #5 : Testing Culture tips
• Gamble in the office
– Encourage betting in the office
– Get people to make a prediction, get them hooked
• Put BIG pictures on the wall
– Make people stop and ask questions
• Market the results like a pro
– Market this stuff internally like a PR agency
– Send emails around, put on the intranet, optimise internally
• Offer a company wide prize
– Offer a prize for best hypothesis that makes it to a test
– Do a monthly or quarterly award – make it visible
– A great incentive to come up with provable, winning ideas
– Do NOT limit to the testing team
– Run a TESTATHON to generate bulk ideas
@OptimiseOrDie
50. #6 : Execution problems
• Silo Mentality means pass the product
– No 'one team' approach means no 'one product'
• The process is badly designed
– It's set up to introduce delays or loops of tweaking
• People mistake hypotheses for finals
– Endless argument, tweaking means NO TESTING – let the test decide, please!
• No clarity : authority or decision making
– You need a strong leader to get things decided
• Signoff takes far too long
– Signoff by committee is a velocity killer – the CUSTOMER and the NUMBERS are the signoff
• You set your target too low
– Aim for a high target and keep increasing it
@OptimiseOrDie
51. #6 : Execution solutions
• Agile, One Team approach
– Everyone works on the lifecycle, together
• Hire Polymaths
– T-shaped or just multi-skilled, I hire them a lot
• Use Collaborative Tools, not meetings
– Speeds up time to decide, iterate, resolve, remove defects
• Smash down silos – a special mission
– Involve the worst offenders in the hypothesis team
– "Hold your friends close, and your enemies closer"
– Work WITH the developers to find solutions
– Ask Developers and IT for solutions, not apologies
@OptimiseOrDie
52. #6 : Product cycles are too long
(Chart: conversion plotted over an 18-month product cycle – months 0, 6, 12, 18)
@OptimiseOrDie
53. #6 : Solutions
• Give Priority Boarding for opportunities
– The best seats reserved for metric shifters
• Release more often to close the gap
– More testing resource helps, analytics 'hawk eye'
• Kaizen – continuous improvement
– Others call it JFDI (just f***ing do it)
• Make changes AS WELL as tests, basically!
– These small things add up
• RUSH Hair booking – Over 100 changes
– No functional changes at all – 37% improvement
• Inbetween product lifecycles?
– The added lift for 10 days work, worth 360k
@OptimiseOrDie
55. #7 – Get Experienced
• Persuasion / Influence / Direction / Explanation
• Helps people process information and stories
• Vital to sell an 'experience'
• Helps people recognise and discriminate between things
• Supports Scanning Visitors
• Drives emotional response
short.cx/YrBczl
56. Photo UX
• Very powerful and under-estimated area
• I've done over 20M visitor tests with people images for a service industry – some tips:
• The person, pose, eye gaze, facial expressions and body language – cause visceral emotional reactions and big changes in behaviour
• Eye gaze crucial – to engage you or to 'point'
57. Photo UX
• Negative body language is a turnoff
• Uniforms and branding a positive (ball cap)
• Hands are hard to handle – use a prop to help
• For Ecommerce – tip! test bigger images!
• Autoglass and Belron always use real people
• In most countries (out of 33) with strong female and male images in test, female wins
• Smile and authenticity in these examples is absolutely vital
• So, I have a question for you
@OptimiseOrDie
62. #8 : Get Multichannel
• Not using call tracking
– Look at Infinity Tracking (UK)
– Get Google keyword level call volumes!
• You don’t measure channel switchers
– People who bail a funnel and call
– People who use chat, go to store, use ringback, email etc.
• You ‘forget’ mobile & tablet journeys
– Walk the path from search -> ppc/seo -> site
– Optimise for all your device mix & journeys
• You’re responsive
– Testing may now bleed across device platforms
– Changing in one place may impact many others
– QA, Device and Browser testing even more vital
@OptimiseOrDie
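One way to stop 'forgetting' the visitors who switch channel is to instrument the switch itself. A minimal sketch, assuming Universal Analytics (analytics.js) and tap-to-call links on the page; the selector and event names are illustrative:

```typescript
// Track taps on tel: links as events, so visitors who bail out of the web journey
// and phone instead show up in analytics as a channel switch.
// Assumes analytics.js; category/action names are illustrative.
declare function ga(...args: unknown[]): void;

document.querySelectorAll<HTMLAnchorElement>('a[href^="tel:"]').forEach((link) => {
  link.addEventListener('click', () => {
    ga('send', 'event', 'channel-switch', 'tap-to-call', link.href);
  });
});
```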
63. #8 : Multichannel – cross device
• One reason conversion is lower is channel jumps
• You need to try and glue this together
• Don't rely on amazing analytics promises
• If you don't have a strategy here, you need one
• You need to think about identifying the customer
• At every touch point and micro or macro interaction
• Loyalty, marketing, wifi, incentives, geo, instore = glue
• What's your strategy on post identification?
• Do you rewrite the data or just ignore it?
@OptimiseOrDie
64. #9 : Get Segmented on Testing
• Averages lie
– What about new vs. returning visitors?
– What about different keyword groups?
– Landing pages? Routes? Attributes
• Failed tests are just ‘averaged out’
– You must look at segment level data
– You must integrate the analytics + a/b test software
• The downside?
– You'll need more test data – to segment
• The upside?
– Helps figure out why a test didn't perform
– Finds value in failed or 'no difference' tests
– Drives further testing focus
@OptimiseOrDie
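A minimal sketch of what looking at segment-level data means in practice, with an illustrative record shape: the blended average can be flat while individual segments move in opposite directions.

```typescript
// Conversion rate per variant, broken down by segment, instead of one blended average.
// The Hit shape and segment names are illustrative.
interface Hit { variant: 'A' | 'B'; segment: string; converted: boolean; }

function conversionBySegment(hits: Hit[]): Map<string, { A: number; B: number }> {
  const counts = new Map<string, { conv: [number, number]; total: [number, number] }>();
  for (const h of hits) {
    const c = counts.get(h.segment) ?? { conv: [0, 0], total: [0, 0] };
    const i = h.variant === 'A' ? 0 : 1;
    c.conv[i] += h.converted ? 1 : 0;
    c.total[i] += 1;
    counts.set(h.segment, c);
  }
  const rates = new Map<string, { A: number; B: number }>();
  for (const [segment, c] of counts) {
    rates.set(segment, { A: c.conv[0] / c.total[0], B: c.conv[1] / c.total[1] });
  }
  return rates; // e.g. new visitors up, returning visitors down, blended average flat
}
```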
65. #10 : Get stats training
• Many testers & marketing people struggle
– How long will it take to run the test?
– Is the test ready?
– How long should I keep it running for?
– It says it's ready after 3 days – is it?
– Can we close it now – the numbers look great!
• A/B testing maths for dummies:
– http://bit.ly/15UXLS4
• For more advanced testers:
– Read this : http://bit.ly/1a4iJ1H
• A stats course would be handy:
– A decent guide for testing or marketing pros
– Watch this space…
@OptimiseOrDie
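A rough sketch of the "how long will it take?" question above: divide the required sample per variant by the eligible daily traffic, then round up to whole weeks so weekday and weekend behaviour are both covered. The traffic figure is illustrative; the sample size comes from a calculation like the one earlier on this page.

```typescript
// Estimate test duration from the required sample per variant and daily traffic,
// rounded up to full weeks. Don't stop early just because the tool says "significant".
function testDurationDays(samplePerVariant: number, dailyVisitors: number, variants = 2): number {
  const perVariantPerDay = dailyVisitors / variants;
  const days = Math.ceil(samplePerVariant / perVariantPerDay);
  return Math.ceil(days / 7) * 7; // run in whole-week blocks
}

console.log(testDurationDays(53000, 8000)); // 14 days with this illustrative traffic
```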
66. #11 : Hire and retain great analysts…
• Long topic – mail me for killer questions
• Pay above market rates – 10k less is not ROI
• Don't make them into report monkeys
• Find a knowledge sharer and trainer!
• Let them solve business problems
• Invest in the analytics, people, continually
• Pay extra for specialist help
• Give staff direct outcome connections
• Freedom to grow and learn is vital
• Help them fix broken stuff!
@OptimiseOrDie
67. SUMMARY : The best Companies….
• See the Maturity Model in the resource pack
• Invest continually in analytics instrumentation, tools, people
• Use an Agile, iterative, cross-silo, one team project culture
• Prefer collaborative tools to having lots of meetings
• Prioritise development based on numbers and insight
• Practice real continuous product improvement, not SLED
• Source photos and content that support persuasion and utility
• Have cross channel, cross device design, testing and QA
• Segment their data for valuable insights, every test or change
• Continually reduce cycle (iteration) time in their process
• Blend 'long' design, continuous improvement AND split tests
• Make optimisation the engine of change, not the slave of ego
@OptimiseOrDie
68. So you want examples?
• Belron – Ed Colley
• Dell – Nazli Yuzak
• Shop Direct – Paul Postance (now with EE)
• Expedia – Oliver Paton
• Schuh – Stuart McMillan
• TSR Group – Pete Taylor
• Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann
• Gov.uk – Adam Bailin (now with the BBC)
Read the gov.uk principles : www.gov.uk/designprinciples
And my personal favourites of 2013 – Airbnb and Expensify
#PRWDReveal
69. #12 : Do things that don't scale
• World Economic Forum's Technology Pioneers 2014
• Zero to $3Bn in 5 years – servicing a Wembley every night
• FREE Professional Photography
@OptimiseOrDie
72. Do things that don't scale:
"It's better to have 100 people love you than 1,000,000 who like you."
"Create the perfect experience however you need to do it, and then scale that experience."
"It's creating an experience. And then it's multiplying them. Too many people start with 'how many you sell' and then they try to make it better."
– Brian Chesky, CEO, Airbnb
@OptimiseOrDie
73. If it isn't working, you're not doing it right
@OptimiseOrDie
74. Optimise Or Die
• Unless you have a dominant natural or physical advantage, the playing field is open to anyone.
• Unfettered customer ratings and comments are here to stay : Think "Net Promoter Score on Steroids"
• Market leaders like Nokia, Blackberry, Best Buy, Borders, The Washington Post and Blockbuster all got hit – why not you?
• One investor asks new startups : "What if Amazon did your thing?"
• Airbnb, Uber, Lyft, Lovefilm, Spotify and Google (Analytics) are excellent examples of innovation competition.
• You must design and scale an amazing customer experience to win
• If your competitor is doing this stuff, you'll have to spend MUCH more to beat them.
@OptimiseOrDie
79. Maturity Model
Level 1 – Starter Level; Level 2 – Early maturity; Level 3 – Serious testing; Level 4 – Core business value; Level 5 – You rock, awesomely
Culture: local heroes and a small team picking low hanging fruit ("Chaotic Good") -> a dedicated team working volume opportunities -> a cross-silo team running systematic tests -> a "Ninja Team" with testing in the DNA
Process: ad hoc -> outline process -> well developed -> streamlined -> company wide
Testing focus: guessing -> A/B testing, basic tools, session replay, no segments -> spread tool use, funnel optimisation, call tracking, some segments, micro testing -> cross channel and multivariate testing, integrated CRO and analytics, segmentation -> dynamic adaptive targeting, machine learning, realtime
Analytics focus: bounce rates, low converting big-volume landing pages and high loss pages -> funnel analysis and funnel fixes, forms analytics -> multichannel + offline funnels integration, channel switches -> from a single channel picture to cross channel synergy
Insight methods: surveys, contact centre, low budget usability testing/research -> prototyping, session replay, onsite feedback -> User Centred Design, layered feedback, mini product tests, customer sat scores tied to UX -> regular usability, an all-channel view, rapid iterative testing and design, driving offline using online, all promotion driven by testing
Mission: get buy-in -> prove ROI -> scale the testing -> mine value -> continual improvement
80. 5 - Triage and Triangulation
"This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift."
• Starts with the analytics data
• Then UX and user journey walkthrough from SERPS -> key paths
• Then back to analytics data for a whole range of reports: segmented reporting, traffic sources, device viewport and browser, platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME
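A minimal sketch of that quadrant step: score each opportunity for estimated uplift and difficulty, then sort so the "highest and easiest" cluster is worked first. The items and scores are illustrative; the real estimates come out of the analytics and UX review above.

```typescript
// Rank opportunities by estimated uplift relative to difficulty (both on a 1-10 scale).
// The opportunities listed are illustrative.
interface Opportunity { name: string; uplift: number; difficulty: number; }

function prioritise(opportunities: Opportunity[]): Opportunity[] {
  return [...opportunities].sort(
    (a, b) => b.uplift / b.difficulty - a.uplift / a.difficulty,
  );
}

console.log(prioritise([
  { name: 'Fix broken error tracking', uplift: 4, difficulty: 1 },
  { name: 'Rebuild checkout funnel', uplift: 9, difficulty: 8 },
  { name: 'Test new delivery messaging', uplift: 6, difficulty: 2 },
]));
```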
81. 5 - The Bucket Methodology
"Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost."
Test
If there is an obvious opportunity to shift behaviour, expose insight or increase conversion – this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.
Instrument
If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling on the analytics configuration. We instrument both structurally and for insight in the pain points we've found.
Hypothesise
This is where we've found a page, widget or process that's just not working well but we don't see a clear single solution. Since we need to really shift the behaviour at this crux point, we'll brainstorm hypotheses. Driven by evidence and data, we'll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.
Just Do It
JFDI (Just Do It) is a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Stuff in here requires low effort or is a micro-opportunity to increase conversion, and should be fixed.
Investigate
You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
82. 5 - Belron example – Funnel replacement
Final prototype, Usability issues left, Final changes, Release build, Legal review kickoff, Instrument analytics, Signoff (Legal, Mktng, CCC), Test Plan, Marketing review, Cust services review kickoff, Instrument Contact Centre, Offline tagging, QA testing, End-End testing, Launch 90/10%, Launch 80/20%, Launch 50/50%, Monitor < 1 week, Go live 100%, Monitor, Analytics review, Washup and actions, New hypotheses, New test design, Rinse and Repeat!
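A minimal sketch of the staged launch in the flow above (90/10, then larger shares, then 100%): hash the visitor id so allocation stays sticky across visits while the share is raised as confidence grows. The hashing scheme and ids are illustrative, not how Belron implemented it.

```typescript
// Deterministically allocate a visitor to the new funnel for a staged rollout.
// The same visitor id always gets the same answer at a given share, so the
// experience stays sticky while the rollout percentage is increased over time.
function inNewFunnel(visitorId: string, newFunnelShare: number): boolean {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple deterministic hash
  }
  return (hash % 100) / 100 < newFunnelShare;
}

inNewFunnel('visitor-123', 0.1); // 90/10 launch
inNewFunnel('visitor-123', 0.5); // later, 50/50
inNewFunnel('visitor-123', 1.0); // go live 100%
```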
84. END SLIDES
Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck.
If it was useful to you – email me or tweet me and tell me why – I'd be DELIGHTED to hear!
Cheers,
Craig.
Editor's Notes
"A piece of paper with your design mockup. A customer in a shop or bookstore. Their finger is their mouse, the paper their screen. Where would they click? Do they know what these labels mean? Do they see the major routes out of the page? Any barriers? Congratulations, you just got feedback on your design, before writing a single freaking line of code or asking your developers to keep changing stuff."
This stuff is important. What do photographs do? Well, they help me persuade people, influence their thinking, give them directions or cues and explain things – this is the scanning generation! And they're very powerful when selling experiences, stories or using the power of social proof. They help people very quickly (more quickly than reading) discriminate and evaluate – work out what stuff is, how it's organized, what the things are, what's being shown to you. And most importantly, they drive emotional response in people. Whether you like being soggy, wet and without toilet paper for a 30 mile radius or not, a picture like this gets a RESPONSE! Work it! Lastly, a shout out to James Chudley, whose book this example comes from.
So this is quite a powerful area – what about people images? I've tested quite a few of these – in over 20 countries and over 15 languages. What did I find? Well - the person, pose, eye gaze, facial expressions and body language cause visceral emotional reactions and big changes in behaviour. The difference between a crappy image and one optimised to get the right response is huge. And one interesting thing: eye gaze is pretty crucial – to engage you, the viewer, or to 'point' and draw attention to a product. I've tested all angles of viewing and in these people images, the best view is straight at the viewer or slightly away. Any further and the conversion rate drops. It makes a difference I can count.
Add unique phone numbers to all your mobile sites and apps. That's for starters. Then configure your analytics to collect data when people click or tap to make a phone call. Make sure you add other events like ringbacks, email, chat – any web forms or lead gen activity too.
Tomorrow - Go forth and kick their flabby low converting asses