The focus on optimisation sets an interesting constraint on the use of web analytics: you can optimise a process to improve efficiency or reduce errors, but the implication is that analytics has little to offer the design process itself. I think there are significant opportunities there.
We are shaped by the tools we use habitually. It's good for us as practitioners to be able to draw on a range of different disciplines. And the current generation of tools like Google Analytics means you don't have to be a specialist to get depth and value.
London Heathrow Airport Terminal 5 forecast that future travellers would be older. Research into older travellers showed they often went into the toilets, so many new toilets were planned. However, deeper investigation discovered they were going into the toilets to hear the announcements – it was the only place they could find where they could clearly hear the flight calls! So the airport is now putting in audio areas where you can clearly hear your flight call. (credit: Clive Grinyer)
Shopping cart abandonment can be easily measured, and it is a useful metric from a business perspective – it measures sales not made today. But problems arise when it is used to describe the customer experience, and when conclusions like this one are drawn. Abandonment as it is framed here is a technical concept that originates from the business's mental model of its own process. It doesn't effectively describe the customer experience, and it can seriously constrain thinking from a customer-centric perspective.
http://www.shopsafe.co.uk/news/online-shopping-cart-abandonment-analysed-by-royal-mail/10098
http://econsultancy.com/blog/6075-checkout-abandonment-on-the-rise
Of course, it is easy to see that customer behaviours are significantly more complex than the headlines suggest – as per this recent research from Forrester. Three out of five of these reasons don't represent abandonment at all – they represent real opportunities to create an ongoing relationship with the customer.
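The point above can be made concrete with a toy calculation. This is a minimal sketch with fabricated session records and invented exit-reason labels (not real analytics output or the Forrester categories): the headline abandonment rate lumps together behaviours that mean very different things for the customer relationship.

```python
# Sketch: a headline "abandonment rate" conflates distinct customer behaviours.
# All session data and exit-reason labels below are hypothetical illustrations.
from collections import Counter

sessions = [
    {"added_to_cart": True,  "purchased": True,  "exit_reason": None},
    {"added_to_cart": True,  "purchased": False, "exit_reason": "comparing prices"},
    {"added_to_cart": True,  "purchased": False, "exit_reason": "saving for later"},
    {"added_to_cart": True,  "purchased": False, "exit_reason": "shipping cost too high"},
    {"added_to_cart": False, "purchased": False, "exit_reason": None},
]

carts = [s for s in sessions if s["added_to_cart"]]
abandoned = [s for s in carts if not s["purchased"]]

# Headline metric: every cart without a purchase counts as a lost sale.
abandonment_rate = len(abandoned) / len(carts)
print(f"Abandonment rate: {abandonment_rate:.0%}")  # 75%

# The same sessions, split by what the customer was actually doing:
print(Counter(s["exit_reason"] for s in abandoned))
```

Here "comparing prices" and "saving for later" are not lost sales at all – they are early steps in a purchase journey, which the single headline number hides.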
The fact that we continue to use these business-objective metrics to characterise customer behaviour is potentially a real missed opportunity.
Case study for a previous iteration of the design: conversion rates had declined over the lifetime of the site – around a 2-year process. The sense was that the shopping basket and checkout were increasingly outdated and that this was preventing completion of sales. A project was defined to redesign the checkout and basket, and a comparative usability study was commissioned. Closer exploration of the available analytics didn't really support the theory that the checkout was the problem area, so the user study was widened. It turned out that the merchandising strategy for the site had also degraded over the lifecycle – customers just weren't being presented with product effectively, and the checkout process was not the main issue.
May require building additional relationships within the client organisation
A very challenging target audience of 'at risk' learners – older, generally with low levels of previous education, generally lacking confidence with computers, and very often with other issues that are a barrier to engaging with education. A 'launch early and listen' strategy. An initial exploration of the analytics helped us define the targets for the research for this iteration, and helped define the demographics – so a very targeted, cost-effective piece of user research. One of the main target areas was use of video on the site. Less than 10% of visitors used the video – so there was significant pressure to reduce its use on the site. But those that did often watched several video segments. From the user research we identified a specific customer segment – the core of our most vulnerable users – that was motivated and engaged by video. A model of what they got from the video, and how they were likely to use it, defined how video should be used in future.
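The shape of that video finding is easy to express as a quick segmentation query. This is a minimal sketch over fabricated visitor records (the real OU figures are not reproduced here): the headline share of video users looks tiny, but engagement among the users who do watch tells a different story.

```python
# Sketch: reading the video-usage pattern described in the case study.
# The visitor records below are fabricated for illustration only.

visitors = [{"id": i, "video_segments": 0} for i in range(19)]
visitors.append({"id": 19, "video_segments": 5})  # one engaged viewer

viewers = [v for v in visitors if v["video_segments"] > 0]

share_watching = len(viewers) / len(visitors)  # headline: "hardly anyone uses video"
avg_segments = sum(v["video_segments"] for v in viewers) / len(viewers)

print(f"{share_watching:.0%} of visitors watch video")            # 5%
print(f"...but viewers watch {avg_segments:.1f} segments each")   # 5.0
```

The first number alone argues for cutting video; the second flags a segment worth recruiting into qualitative research, which is exactly the combination the case study describes.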
Analytics isn't enough – it does not generate formative insights on its own.
Qualitative, observational user research generates formative insights, but is too expensive to do continually.
Combining analytics and user research
Combining Analytics and User Research<br />Alex Tarling<br />User Experience Consultant<br />
About the session:<br />Why is it good to combine methodologies?<br />Why doesn’t this commonly happen already?<br />Some opportunities to combine analytics and user research…<br />… some case studies and some hints and tips to get started!<br />
Who am I?<br />Freelance user experience consultant<br />12 years experience of design research, UX, information architecture<br />Projects for Intel, BBC, Nokia, Orange, New Look, Blacks, Millets etc<br />http://linkedin.com/in/alextarling<br />
User Experience Research<br />"User experience research is a collection of tools designed to help you find the boundaries of people's needs and abilities" - Mike Kuniavsky<br />"The field of user experience is blessed (or cursed) with a very wide range of research methods" - Christian Rohrer<br />
Web Analytics<br />"Web Analytics is the measurement, collection, analysis and reporting of Internet data for the purposes of understanding and optimizing Web usage." - The Official WAA Definition of Web Analytics<br />
Why use a combination of methods?<br />”When all you have is a hammer, everything looks like a nail”- Abraham Maslow<br />
Why use a combination of methods?<br />All methods have strengths and weaknesses<br />Combining methods with different attributes allows us to:<br />Triangulate between the strengths of individual methods<br />Mitigate the weaknesses and risks of each<br />
Why are user research and analytics methods not routinely combined?<br />Because user research and analytics are often commissioned and actioned by very different functions in the organisation.<br />Because user research and analytics often happen at different stages in the product cycle.<br />Because analytics developed in a context of measuring against business goals, whereas user research is deployed as an aspect of designing the customer experience.<br />
Attributes of different methods:<br />Observational User Research<br />Primarily qualitative: provides insights about users’ goals, motivations and attitudes<br />We can explore context and opportunities, and what doesn’t happen as well as what does happen<br />But, small sample sizes and high cost mean we end up with a snapshot in time, and specific demographics. Lab settings can also be problematic<br />So, observational findings are open to challenge and multiple interpretations<br />Web Analytics<br />Quantitative and based in real-world data – talks about what is happening. <br />Large sample sizes mean high degrees of confidence<br />But, interpretation of behavioural aspects is hard without additional customer insight<br />Where we predefine measures, prior assumptions about meaning and significance can become entrenched<br />
Design of Terminal 5 - Risks of insight research…<br />
Quantitative methods also have downsides...<br />"Not everything that can be counted counts; and not everything that counts can be counted" - Albert Einstein<br />
We have ‘abandonment’ issues…<br />www.shopsafe.co.uk/news/online-shopping-cart-abandonment-analysed-by-royal-mail/10098<br />… assumptions about the implications of ‘abandonment’ don’t speak to the real-world customer experience…<br />
We have ‘abandonment’ issues…<br />(econsultancy.com/blog)<br />
We have ‘abandonment’ issues…<br />= Still missing the opportunity to genuinely explore and design for customer experience!<br />
Opportunity 1:<br />Use customer insight data to inform analytics measures<br />So that analytics reporting genuinely reflects and supports the customer experience, not just the business goals<br />
Opportunity 1: use customer insight data to inform analytics measures<br />How to get started:<br />Deliver actionable analytics metrics from your customer research insights<br />(because user research and analytics often happen at different stages in the product cycle)<br />Broker communication and collaboration across informational silos, and between phases in the product lifecycle<br />(because user research and analytics are often commissioned and actioned by very different depts.)<br />
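One way to picture "deliver actionable analytics metrics from your customer research insights" is as a small translation step: a qualitative observation becomes a behavioural pattern you can count. The insight, event names, and session data below are all hypothetical, invented purely to illustrate the shape of that translation.

```python
# Sketch: turning a qualitative research insight into a trackable metric.
# The insight, event names and sessions here are hypothetical illustrations.
#
# Hypothetical insight (from user research): customers who feel unsure about
# delivery repeatedly open the delivery-information page during checkout.
# Derived metric: share of checkout sessions with >= 2 delivery-info views.

def delivery_uncertainty_rate(sessions):
    """Share of checkout sessions showing the 'unsure about delivery' pattern."""
    flagged = [s for s in sessions if s.count("delivery_info_view") >= 2]
    return len(flagged) / len(sessions)

checkout_sessions = [
    ["cart", "delivery_info_view", "payment", "confirm"],
    ["cart", "delivery_info_view", "delivery_info_view", "exit"],
    ["cart", "payment", "confirm"],
    ["cart", "delivery_info_view", "payment", "delivery_info_view", "exit"],
]
print(f"{delivery_uncertainty_rate(checkout_sessions):.0%}")  # 50%
```

The metric now reflects something the research showed customers actually experience, rather than only the business's view of the funnel – which is the point of this opportunity.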
Opportunity 2:<br />Use analytics data to drive the user research programme<br />
E-commerce redesign project:<br />Driving the user research programme from analytics data<br />
Opportunity 2: use analytics data to drive the user research programme<br />How to get started:<br />Just do it! <br />You can do a lot with even basic levels of analysis<br />Think laterally about the sources of data that are available<br />Commission specific analysis of existing data sources<br />
Opportunity 3:<br />Integrate analytics and user research to optimise the user experience throughout the product lifecycle<br />
The Open University <br />support for new distance learners:<br />
Opportunity 3: Integrate analytics and user research to optimise user experience throughout the product lifecycle<br />How to get started:<br />Ongoing use of analytics <br />to discover and target behaviours or interactions for the research programme<br />to ‘evidence’ user research insights<br />Rapid iterations of ‘fast’ user research interventions <br />targeted against specific analytics findings<br />to generate insights and use-cases for development<br />Integration means adapting the product lifecycle to support continual innovation<br />‘Launch early and listen’ and agile strategies<br />
In summary<br />Opportunity 1: use customer insight data to inform analytics measures<br />Opportunity 2: use analytics data to drive the user research programme<br />Opportunity 3: integrate analytics and user research to optimise user experience throughout the product lifecycle<br />Thanks!<br />Alex.Tarling@gmail.com<br />