Fact or Question: Analytics for UX

Presentation from Mini UPA Boston 2011.
Analysis of online activity (analytics) is attracting growing attention in the work of user experience design. How do we know when our findings are actionable answers, and when they serve as refined questions begging for further exploration through qualitative user research? Can you really learn about your users through analytics?

This presentation will demystify web analytics, addressing common misconceptions. Through tips and examples of tactical applications of free tools and web analytics data in a ‘UX friendly’ context, this session will show that analytics can be an efficient tool for gaining rapid insight into user behavior and improving the value of our designs. We will explore ways to recognize findings that demand further investigation.

  • Here’s an example of why I love analytics, an actionable insight. We had a content strategy problem and a findability problem because of our use of an industry-specific term: ‘AD/HD’. Landmark College, where I direct web strategy, serves students with learning disabilities. When referring to attention disorders, we only ever use the term ‘AD/HD’. Discussions with students indicated many don’t identify as ADHD. ADD to the max, yes. But not hyperactive. Using the clinical AD/HD was a tone problem. But it was also a findability problem. People don’t search for ‘AD-slash-HD.’ Using Google Analytics, I could see that 80% of our traffic was branded search, meaning people who already know us and search for us by name. Branded search is OK, but that *much* branded search means you’re not reaching many new people. However, telling our President we had a problem because we’re seeing “80% branded search” would not have the impact I needed. So I took a look at Google Insights.
  • Maybe you’ve seen this: you can type in multiple search terms and see the quantity of searches for each over time. Here’s what I found: ‘ADD College’ is three times as frequent as ‘ADHD College’, and ‘AD-slash-HD College’ is rated a ZERO. The impact of this data was immediate. Our vice president took one look and let go of the requirement to use only ‘AD/HD.’
    This is a case where data reveals trends that support decision-making.
    Let the data make your case, and egos become irrelevant.

    I’m a generalist, an in-house team of one.
    I rely on analytics for persuasion in a landscape rife with silos and competition.
  • These slides, and links to the people, tools, and resources I mention today, can all be found at this link.
  • I spoke about Measuring UX at a recent conference and afterward asked Dana Chisnell, of Usability Works, for some feedback. She said she worries that people will think they’ve found an answer when they really have another question. I often hear things from skeptics like: Analytics gives us ambiguous data with dubious results. Analytics doesn’t take into account people’s intentions or motivations. Analytics can’t tell us the all-important why behind people’s online behaviors. Even if analytics tells us what people are doing and not why, it offers plenty to inform evaluation and design.

    Photo by Jason Crane
    http://www.flickr.com/photos/snapperwolf/4940044927/in/photostream/
  • I want to use our time today to demystify analytics: to give you examples of instances where analytics gives us actionable insights, and to practice recognizing the questions that remain. So what is analytics?
  • Here’s a definition put together from the writing of the wise Erin Richey and Avinash Kaushik:
    Analytics is the analysis of cross-channel data for trends, patterns & anomalies toward continuous improvement of user experience and business success.

    We may be working with clients and stakeholders on a particular project: a website feature or an app. But our clients need to manage it over time, to ensure they are supporting their bottom line, and to consider success within their engagement ecosystem.

    So where do we get our cross-channel data? We’re going to look at five different types of sources.

    Erin Richey: http://www.slideshare.net/erinjorichey/qualitative-quantitative-learn-more-about-your-users-with-web-analytics
    Avinash Kaushik: http://www.kaushik.net/avinash/2007/09/rethink-web-analytics-introducing-web-analytics-20.html
  • Clickstream Data offers information about behaviors: how people get to your site or your app and what they do once they are there. When we do multivariate testing (or A/B testing), we are looking at where people click.
    Site Search Data is data about people’s use of search ON your site, based on the site’s search logs.
    Competitive Data is data about behavior on multiple sites.
    Customer Questions and Comments come from places such as your call center logs, form submissions, and questions that come in through Twitter, Facebook, etc.
    Surveys in this case may be those auto-launched, brief “Did you find what you were looking for?” types of surveys, or longer surveys you invite visitors or stakeholders to complete.

    http://thenounproject.com/
  • Analytics is amazing because of the quantity of data. Analytics can be overwhelming because of all that data. So where do you start? Well, not with any of these tools. You know the answer to this question. Where do we start?

    http://www.flickr.com/photos/emariephotos/4958245676
  • We start with the business goals. Talk to your internal stakeholders or clients: ask them how they know a particular product or service is successful. What are they trying to accomplish? Be on the lookout for new ways to measure that success. While there are many common metrics (such as income from sales), every business and functional organization has unique metrics as well.

    http://www.flickr.com/photos/cobbo1/324276965/
  • Here’s an example of a unique metric.
    I was waiting for a meeting with our Admissions dept. to start, and I heard them talking about the budget scrutiny they were under. You see, to build awareness of Landmark College, we invite school counselors, all expenses paid, to visit us: we offer a weekend of presentations and of VT tourist activities. It’s costly, and the only success metric the Admissions dept. was looking at was the big conversion: when those school counselors recommend students apply to Landmark. But these relationships take time. I was excited to get back to my office to see if I had data I could give them in support of their initiative. I did.
    Schools very frequently name their networks after the school. Using Google Analytics’ ‘Service Providers’ report, I filtered for visits from networks containing the word ‘school’, and this is what I got: a list of the schools with people visiting our site. These visitors have a 4% conversion rate, compared to 0.5% on the site overall. By conversion rate, in this case, I mean people who requested information, downloaded a whitepaper, or signed up for an open house or other visit to the College.
    Two of the school districts who sent counselors have over 40 visits this month, averaging over 17 minutes per visit. That’s impressive. And not only was marketing able to use this data to justify budget, but front-line Admissions staff are able to identify hot leads to follow up with by phone and develop those relationships further. Giving people data that helps them prove their success, and doing so unasked, is always a win.
    It took some conversations with them to figure out the information display they could understand and use, but that’s a valuable part of the process.
    Surfacing this data also improved the experience of our visitors: I showed this data to the reluctant presenters, faculty and administrators, who frankly thought the program was just some crazy marketing initiative. When they saw how visitors stayed engaged after visiting us, they became more invested in the weekend time they give up for giving those presentations.
    This clickstream data shows us a behavior and a demographic. People from Seattle School District are visiting our site for long periods of time. We could look at the paths individuals took through the site. We could look at the data for the most common pages visited by those people. Where did they spend the most time? But if we want to know what motivates them and what they value, we need to talk with them and look for patterns in their stories.
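    The segment-versus-site comparison behind the ‘Service Providers’ example can be sketched in a few lines. This is a hypothetical illustration with invented sample data, not Google Analytics itself: filter visits whose network name contains ‘school’, then compare that segment’s conversion rate to the site-wide rate.

```javascript
// Hypothetical sketch of the segment logic: keep visits whose network
// name contains 'school', then compare that segment's conversion rate
// to the site-wide rate. Sample data is invented for illustration.
const visits = [
  { network: 'seattle school district', converted: true },
  { network: 'comcast cable', converted: false },
  { network: 'springfield school dist', converted: true },
  { network: 'verizon fios', converted: false },
];

// conversion rate = converting visits / total visits
const convRate = (vs) => vs.filter((v) => v.converted).length / vs.length;

const schoolSegment = visits.filter((v) => v.network.includes('school'));
console.log(convRate(schoolSegment)); // segment rate (1.0 on this toy data)
console.log(convRate(visits));        // site-wide rate (0.5 on this toy data)
```

    The point of the sketch is the shape of the analysis: segment first, then compute the metric for the segment and for the whole, and present the gap.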
  • This table shows business goals, related metrics, and the segments that would inform those metrics. Analytics data is rarely useful until you break it up into segments that have meaning. Audience segmentation is not new to us; we do this all the time. In analytics, segments may be based on demographics or technology used, but also on behavior.
    For instance, if we’re looking at customer loyalty, we might want to measure repeat visits. The segment would be people who come to the site multiple times. We don’t want to just know how many people come to the site multiple times. We want to look at the behavior of that segment. What do these people do on your site or your app?
    There are two very important columns missing from this table. Anyone know what they are? Hint: both columns are things you’d need to get from the client. Target numbers (what % of visits should be repeat visitors) and the dollar value of those conversions.
  • Target numbers (what % of visits should be repeat visitors) and the dollar value of those conversions.
    I asked four VPs what dollar value they would associate with the successes they’d identified. I did not expect the reaction I got: a hugely positive reaction from two of the VPs, and an immediate call for meetings from the two people responsible for the initiatives I was asking about. They were resistant and reactive to the threat of assessing their performance (and concerned that the numbers might be inaccurate). BUT. My request for numbers created a sea change in how much people ask for my input and consultation, in their perceived value of my contributions.
    Ask your clients and stakeholders to identify value. It’s difficult for people, and you’ll get resistance. (“How do I know the value of someone writing a positive review of my app?”) But identifying a number value helps in tracking success over time and deciding where to put design and maintenance resources. You can even start with a 1-10 scale of value if people can’t come up with dollar figures.
    It’s in the _conversations_ about business goals and success measures that we help our stakeholders build things they can care for over time.
    Lynne Polischiuk, a UX consultant, advises we use her rule: never accept the answer ‘more’ for a target number. ‘We want more conversions.’ ‘We want more repeat visitors.’ Ask clients to identify a numeric value or goal (a 10% increase, a $5.00 increase, etc.) to make success measurable.
  • You must segment your data, breaking out meaningful smaller sets, for it to be useful. For the same reasons we identify different audiences and personas as part of design: it’s in the disaggregation that we find actionable information.
    What are the useful segments for a given conversion? If you’re evaluating brand awareness and measuring the success of social referrals, you might want to look at people coming from shortened URLs. Or people coming from mobile devices. The canned reports show you a break-down by device. This is good information. Perhaps the client is pushing for a BlackBerry app, but the data shows that only 7% of mobile visitors are on BlackBerry. However, the canned reports only show you that overview look at ‘how many.’ Analytics tools provide the means to explore the data for a given segment. What are the top pages or features used by people on mobile? Is it typical mobile activity such as maps and contact info? Or is it actual content about our services and products?
    Segmented data provides valuable information toward identifying solutions.
  • The segment of people who ‘converted’, meaning they took an action that’s good for your business, is a great segment to analyze. For the visitors who carried out your big successes (purchased something, signed up for your service), what search terms brought that segment there? What else did they do on your site before purchasing or registering? Segmenting clickstream data is essential to finding actionable insights. Consider creating a landing page for searches with those terms, linking to the content people viewed before purchasing or registering. If you work with marketing, these are successful search terms to invest in.
    Is this black-hat marketing (keyword stuffing), or is it information that will help you give visitors the information they are looking for? Search data tells you what people are looking for, in their own words. You can combine these successful search terms with the terms people searched for on your own site before converting. Jared Spool says the search box should be called BYOL: ‘Bring Your Own Link.’
  • Your site search logs are a great source for popular and successful search terms, as well as common unsuccessful searches, including misspellings. Use this data to support better search success with best bets. This is an example from Peter Morville & Jeffrey Callender’s Search Patterns, in which Peter suggests: “best first is the most universal and important design pattern in search.”
    Site search analysis and fine-tuning are extremely important because, generally, the rate of success after on-site search is very poor.
    http://www.amazon.com/Search-Patterns-Discovery-Peter-Morville/dp/0596802277/ref=sr_1_1?ie=UTF8&qid=1305298943&sr=8-1
    Let’s take a look at brand engagement on social.
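    Finding best-bet candidates in a search log can start with a simple frequency tally. A rough illustration, with an invented log format and invented queries (including a misspelling), not any particular search tool’s output:

```javascript
// Tally site-search terms from a log to surface the most frequent
// queries, misspellings included, as candidates for 'best bets'.
// The search log here is invented sample data.
const searches = ['adhd', 'add', 'adhd', 'tution', 'tuition', 'adhd', 'add'];

const counts = {};
for (const term of searches) counts[term] = (counts[term] || 0) + 1;

// Sort terms by frequency, most common first.
const ranked = Object.entries(counts).sort((a, b) => b[1] - a[1]);
console.log(ranked[0]); // ['adhd', 3]
```

    In practice you would also separate successful from zero-result searches, since the zero-result terms (like the misspelling above) are the strongest best-bet candidates.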
  • Facebook offers pretty minimal free analysis of engagement with your pages. Here we’re seeing the number of new likes, all likes, and the number of active users. Active users gets broken down into page views, post views, likes, comments, etc. However, there’s very little in the way of comparison over time, in order to spot trends.
    If we want to see what kinds of posts generated engagement, we’d be going on a treasure hunt through the posts.
    Of course, Facebook sells access to this data to third parties, whom you can pay for more robust analysis.
  • Social Bakers is a Facebook statistics service that offers trend data and analysis of engagement, post quality, and key influencers. In this screen shot we see a graph of fan views of specific posts, with the actual posts showing on the right. This is an improvement over the free Facebook view. (With a free Social Bakers account, you can track one page.)
    Does this kind of data tell us ‘why’ people are engaging? No, but an affinity analysis, grouping types of posts, would tell us the kinds of posts that encourage response on our pages. This is useful information for planning content, even if we don’t have explicit information about why.
  • Social Bakers also offers tips regarding your post quality. Apparently this page needs some work. Whether or not our projects involve measuring brand engagement on Facebook, it’s helpful to be aware of the kind of data available to our clients. Many analytics tools make use of their vast stores of data to provide tips such as these. I can’t speak for the merit of Social Bakers’ suggestions, but I’ve found Google AdWords campaign suggestions to be very successful.
    Let’s take a look at a glassblower’s website.
  • The glassblower identified that he wants to increase the number of people signing up for glassblowing lessons by 10%. There’s some investigation needed to address this goal. Are people finding glassblowing lessons? Do those who do know how to sign up?
    There’s some search engine ranking and social media to evaluate, to see if people are able to discover the site.
    But once people are on the site:
    Can they find glassblowing lessons? If they do, do they sign up? Why or why not? Which of these questions can analytics data answer?
  • Here’s an overview of visits from people who came after searching the web for glassblowing lessons or classes.
    They have a very low bounce rate: only 27% of these people leave without looking at a second page on the site.
  • One useful piece of data will be what people who came to the site after searching a search engine for glassblowing lessons or classes do once they are on the site.
    This is a screen shot of the ‘create custom segment’ screen in Google Analytics. I’ve set a filter of keywords that contain ‘lessons’ or ‘classes’. Now I can look at what those people did on the site.
  • Here we have a view of the navigation summary for people who came to the site looking for lessons or classes. For the people who went to the glassblowing lessons page, what page did they visit before? After? Here’s a secret: using analytics is an evolution of testing various data views to find what’s useful. Some takeaways: the majority of people finding glassblowing lessons are coming from the home page, the visit-our-studio page, and the artist page. This makes sense; you’d want to find out more about the artist before signing up. Only 10% of people going to the glassblowing lessons page exit the site. They’re still interested enough to look around the site. It would be useful to compare the actual number of people who called, emailed, or submitted a contact form about glassblowing lessons to the number of visits from people looking for glassblowing lessons. This is what their 10% improvement will be based on.
  • This is a view of home page clicks from Crazy Egg. It shows where people who searched for glassblowing lessons clicked on the page. Analytics tools like Google Analytics tell you which destinations were selected, but if you have two links to the same destination (page), they won’t tell you which one of those links was clicked. Crazy Egg uses coordinates, and shows you right where the link was clicked.
    So here we see that of the 10 people who came to the site looking for glassblowing lessons (it’s only 10 because this was a four-day sample), only two clicked on the glassblowing link. No one clicked on the Lessons and Rentals link. This is too small a sample to derive any conclusions from. In general, I find that heat maps can make it plain when calls to action aren’t getting any interest, but beyond that, there’s more to be gained from expert evaluations and usability testing. From a brief look at this data, I would guess that the best place to put attention to increase glassblowing lesson sign-ups is in keyword targeting and in local media.
  • This slide refers to ‘visits with events.’ In this case, events refers to conversions I’ve identified, such as downloading an application or brochure. In our case, these are PDFs, which are not automatically tracked by analytics tools. PDF downloads, Flash movie plays, and links to external sites are not yet visible to clickstream tools. By adjusting your links to these events, you alert your clickstream tool. Here’s an example in Google Analytics.

    Here we have an HTML link with an ‘onclick’ event added. It tells Google Analytics to track the event, and we can choose some descriptors, or categories, for the events.
    When you track events, think about the categories you’ll want to group them in for reports (Help, etc., not just Downloads).
    I can use the labels (e.g. Freshman App vs. Transfer Student App) to get even more detail, but if I want to look at behavior patterns for all people who downloaded an application or went to the online application, categories are very helpful.
    When I first set this up, I used two broad categories: Downloads and External Site. This turned out to be useless because I couldn’t differentiate between applications and whitepapers. So I had two weeks of not-so-useful data. No big deal; there’s a lot of trial and error in tweaking the data you gather.
    The data became much more useful when I broke down these categories (though I wasn’t consistent in capitalization, which matters).
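    Cleaned up, the link from the slide looks roughly like this. This assumes the classic asynchronous Google Analytics snippet (`_gaq`), which is what the `_trackEvent` call on the slide belongs to; the category, action, and label values are the ones from my own setup.

```html
<!-- Track a PDF download as a Google Analytics event.
     Category: 'Apps', Action: 'pdf', Label: 'Freshman App' -->
<a href="../applications/landmark-application.pdf"
   onclick="_gaq.push(['_trackEvent', 'Apps', 'pdf', 'Freshman App']);">
  Application &amp; Checklist [PDF]
</a>
```

    The category/action/label strings are free-form, which is exactly why consistent naming (and consistent capitalization) matters for later reporting.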
  • In his IA Summit talk, A Practical Guide to Measuring the User Experience, Richard Dalton suggested: “Ask: how would the user behave if we nailed the design? How would they behave if we screwed up?”
    http://www.slideshare.net/mauvyrusset/a-practical-guide-to-measuring-user-experience
  • So here’s an electronic invitation site. Help me out: how would the visitor behave if we nailed the design? How would they behave if we screwed up?
    e.g. Creating multiple accounts
  • Here we’ve got my Events page. There are three events that all take place on the same date and location. One event has 119 invites and the others each have one. I’d argue we don’t need to talk to someone to know their intent here. They did not want three instances of this event.
    With an app, identify signs of problems. In this case, signs of having trouble successfully creating an event: creating an account but never creating an event, multiple invites for the same event, duplicate recipients.
  • If you’re working on a competitive analysis, compete.com is a great resource. You enter domains and get lists of the top search terms driving people to those sites. This example compares glassblower sites. We can see how many visits the sites get, the top search terms bringing people to the sites, and what sites link to them. (For companies of significant size, branded search is often all you’ll see in the free version of compete.com, which is not very useful in telling you what people are seeking on the competition’s site. To address this, you can use the ‘related searches’ filter in Google.) The search terms often give clues as to the most sought information on those sites. Are these problems you want to solve as well? Include the search terms in your competitive analysis.
  • In addition to the popularity rating we saw with ADHD College vs. ADD College, Google Insights offers related popular search terms, and search terms that are rising in popularity. These are also useful to include in a competitive analysis. What are people looking for? Do you want to address any of these needs in your products, services, or content?
  • Popular search terms also support work on personas and information architecture. In this example, I’ve used the Google Keywords Finder tool to find popular search terms related to asthma. You can dig deep into these search terms, cluster them, and connect them with the personas that have related tasks.
  • Here’s an A/B testing example, this one for the bulk email tool MailChimp. A/B testing is when you compare success rates for different designs. Often only one element of the design will be tested at a time, in order to limit the confounds in the test. You can do A/B testing of prototypes and of elements on live sites, including content. Here we can choose whether we’re testing subject lines, from names, or delivery dates/times. The MailChimp example is great because it makes instant use of available data. It sends one design to 10% of the mailing list and a second design to another 10%. Whichever design is more successful is the one that gets sent to the other 80% of the list. Here we choose the basis for success: open rate or click rate.
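    The winner-takes-the-rest logic described above can be sketched like this. The function name and sample numbers are hypothetical, for illustration only; this is not MailChimp’s actual implementation.

```javascript
// Sketch of a MailChimp-style split test: send variant A to 10% of the
// list and variant B to another 10%, then send whichever performed
// better, by the chosen basis, to the remaining 80%.
function pickWinner(variantA, variantB, basis) {
  // basis is 'opens' or 'clicks'; rate = events / sends
  const rate = (v) => v[basis] / v.sends;
  return rate(variantA) >= rate(variantB) ? variantA : variantB;
}

const a = { name: 'Subject line A', sends: 500, opens: 110, clicks: 40 };
const b = { name: 'Subject line B', sends: 500, opens: 95, clicks: 55 };

console.log(pickWinner(a, b, 'opens').name);  // 'Subject line A'
console.log(pickWinner(a, b, 'clicks').name); // 'Subject line B'
```

    Note that the chosen basis matters: on these sample numbers, A wins on open rate while B wins on click rate, which is exactly why the tool asks you to pick the success measure up front.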
  • Here’s a content strategy example:
    Say you need to decide where to put resources in terms of content development.
    Review the ratio of content to visits. Is there a mismatch between what interests people and where you have content? What’s the most popular content on the site? What area can you prioritize with more good content? Look at your most-visited pages and top search terms. Do you have a lot of content in under-utilized areas?
    This information display has helped me demonstrate the intense need we have for writing program-specific resources. It has helped me demonstrate that ‘About’ has been a fall-back location for avoiding silo issues.
    These are examples where analytics informs design and helps people prioritize work.
  • I hope I’ve succeeded in demystifying analytics some for you, and in giving you a taste for the breadth of useful data at your fingertips. Are there any questions you’d like to ask?

    1. Fact or Question? Analytics for User Experience. Julie Strothman, strottrot.com, @strottrot
    2. Search Term Usage, on a scale of 1-100: ‘ADD College’: 62; ‘ADHD College’: 21; ‘AD/HD College’: 0
    3. Slides, Links & References: http://strottrot.com/u/7 @strottrot
    4. “I worry that people will think they’ve found an answer when they really have another question.” - Dana Chisnell
    5. Demystifying Analytics
    6. Analytics: Analysis of cross-channel data for trends, patterns & anomalies toward continuous improvement of user experience & business success. @erinjo + @avinash
    7. Clickstream Data / Site Search Data (your search logs) / Competitive Data / Customer Questions (call center logs, form submissions) / Surveys
    8. Where do I start?
    9. Identify Business Goals
    10. Every business has unique metrics
    11. Segment Data for More Actionable Information
        Business Goal | Metrics | Segments
        Customer loyalty | Repeat visits | Number of visits (e.g. 2, 3+)
        Customer loyalty | Time on feature | People using the given feature
        Brand awareness | Likes, mentions | Visits from Facebook, Twitter
        Brand awareness | Branded search | Traffic Source: search terms with brand name
    12. Segment Data for More Actionable Information (the same table, with two added columns: Target # and $ Value)
    13. Segment Data for More Actionable Information
    14. Audience Segment: Visits with Conversions. Search terms that brought people who ‘converted’
    15. Use successful search terms to drive ‘best bets’. Best Bets — misspelled words
    16. Goal: Brand Engagement
    17. Goal: Brand Engagement
    18. Goal: Brand Engagement
    19. Business Goal: Sign more people up for glassblowing lessons
    20. Overview: People who came looking for lessons
    21. What do people who came looking for lessons or classes do on the site?
    22. Where did people who searched for glassblowing lessons click?
    23. Tracking ‘Events’ (PDF downloads, movie plays). Category / Action / Label: <a onclick="_gaq.push(['_trackEvent', 'Apps', 'pdf', 'Freshmen App']);" href="../applications/landmark-application.pdf">Application & Checklist [PDF]</a> Useless categories! Better, but caps issue.
    24. Identify Possible
    25. Using Data
    26. Competitive Analysis
    27. Compete.com terms / Google Insights terms
    28. Keyword Searches for Personas
    29. Are you putting resources into the right content? % Content vs. % Visits: 35%, 28%, 20%, 13%, 5% across About, Academics, Student Life, Library, Institute, Alumni. Credit: Tim Hart via Avinash Kaushik: http://strottrot.com/u/6
    30. Are you putting resources into the right content? (same chart)
    31. Thank You! Slides, Links & References: http://strottrot.com/u/7. Julie Strothman, strottrot.com, @strottrot
