Web_Analytics_Part2--Analyzing_and_Acting--1-27-2011
 


Second of two web analytics presentations by Char Booth and Paul Signorelli for ALA TechSource. Presentation delivered as a live webinar on January 27, 2011; part 1 on January 20, 2011. Part two focuses on three case studies and the specifics of Google Analytics. Script included in speaker notes; for more information, please visit http://www.alatechsource.org/blog/2011/01/continuing-the-conversation-library-analytics-session-2.html.



Upload Details

Uploaded as Microsoft PowerPoint

Usage Rights

CC Attribution License

  • Welcome, and thanks for joining us for the second of two ALA TechSource sessions designed to help demystify the topic of web analytics. We hope to make this session as interactive as the last, because a lot of valuable information was shared and connections made from audience chat. Please help us continue to benefit from the wisdom and experience you bring to the sessions by making comments and posing questions during the presentation. Today, we’ll be looking beyond the basics: highlighting three case studies of successful analytics use and going more in-depth into using and interpreting Google Analytics in the library environment. One case study runs throughout – Char’s experience using GA at Cal (UC Berkeley), and the two others have their own small sections, San Jose Public Library and Rutgers University.
  • Because Paul took the helm for much of the last presentation, Char will be the lead for #2, again due to our differing perspectives on the topic (mine is more data-oriented and techie, whereas Paul’s is more public service/big picture oriented). We are both going to be active in the session and will help each other field questions during chat periods. Following the webinar, please direct questions about session 1 to Paul and questions about session 2 to me.
  • Answer choices: A. No frustration here B. Getting started with tracking C. Navigating the interface D. Data collection & analysis E. Implementation & action  F. Other – tell us in chat!
  • I wanted to put up this slide again from the Waisberg and Kaushik article to remind us that the data produced by these programs assists you in the process of making iterative design changes to web interfaces and tools, and gives you ongoing insight into changes in your web user population. Not only can you use analytics to define goals for a site or web-based service, identify users’ “key performance indicators,” collect and analyze data, and make adjustments or major design changes to a site; you can also use web analytics to understand the long-term traffic and user behavior that helps you balance the information received from other types of data input. This is one of the most important aspects of analytics tracking to me: monitoring changes in web use of library sites and services over the long haul.
  • Just to recap: logfile analysis and page tagging are the two dominant modes of web analytics, ranging from server log analysis, the most basic analytics strategy, to JavaScript page tagging, which is the method used by Google Analytics.
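To make the logfile side of this concrete, here is a minimal sketch of what log-based analyzers work from: parsing one Apache "combined"-format line into a hit record. The log line itself is invented for illustration; real tools aggregate millions of these.

```javascript
// A minimal sketch of logfile analysis: pull the client IP, request path,
// and status code out of one Apache "combined" log line. The line below
// is invented for illustration.
var line = '203.0.113.7 - - [27/Jan/2011:10:15:00 -0800] ' +
           '"GET /research-guides/history.html HTTP/1.1" 200 5120 ' +
           '"http://www.google.com/search?q=library" "Mozilla/5.0"';

var match = line.match(/^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3})/);
var hit = {
  ip: match[1],        // client address (in-library vs. remote)
  timestamp: match[2], // when the request arrived
  method: match[3],
  path: match[4],      // which area of the site was requested
  status: Number(match[5])
};
console.log(hit.path);   // "/research-guides/history.html"
console.log(hit.status); // 200
```

Aggregating records like `hit` by path and date is essentially what a server-side analyzer does, with no JavaScript on the page at all.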
  • Not to privilege page tagging too heavily, I wanted to include an image showing a basic visualization of server logging analytics. This slide was taken from UC Berkeley’s own logfile analysis program “Wusage,” which runs in tandem with our Google Analytics accounts – the two don’t interfere with each other. The data produced here is basic, and it is also far easier to interpret and use than GA, which has a real learning curve: those who need simple insight into the number of users accessing different areas of the library’s site can come here as a first stop, which then provides an entryway to GA if more detailed information is needed. The figures here and in GA never match exactly, due to a number of factors ranging from page caching (which skews data in logfile analytics) to cookies (which skews data in page tagging analytics).
  • Now we’re going to switch gears and look at one of our two dedicated case studies. In her own library environment, Sarah Houghton-Jan, our colleague from San Jose Public Library and author of the Librarian in Black blog [http://librarianinblack.net], starts with the basic idea of “using web analytics stats to review what's being used on the site and what isn't (including tracking marketing campaigns & their success/failure).” She wants to know “why something isn't being used if you think it should be, and figuring out a way to solve that problem through user interface design, navigation changes, naming,” and other means.
  • For example, she and colleagues knew there was very little use of their library’s research guides on their old site—“something like 200 different subject-based research guides that took about 20 hours a week to maintain.” They tried a variety of ways to increase usage, but “the numbers had dropped even further. It showed us it just wasn't something people wanted...so we cut it back from 200 to four critical guides for the new website.” It’s too soon to tell whether the drastic reduction will make the remaining guides stand out more: when we spoke, only the librarians in the central library had even noticed the reduction, and there had not yet been an increase in use.
  • When thinking about other elements of the library’s web presence, Sarah suggested that “people are looking at social media numbers (Twitter, Facebook) but not at the numbers that matter. Followers/fans is part of it, but how many people do you lose? How many comments? How many re-tweets?... What about people checking in to your libraries on Foursquare, Gowalla, or Facebook CheckIn? What about yelp reviews? I would like to, but am admittedly not, tracking all of this.”
  • Still quoting Sarah Houghton-Jan: “Views & visits of everything, pretty much, is where I would focus… I tend to use simple bar and pie graphs, as well as scatter graphs, to show the visuals of our site trends over time, or compared to other resources...I do this for all of our databases and e-books, too. Also showing, for example, the visits to our catalog daily vs. the visits to our website proper daily vs. the visits to our King Library daily. It really shows people how ‘digital’ does count & does have a lot of activity…also pointing out the percentage of in-library use vs. out-of-library use--many librarians assume most library website/database use comes from inside their four walls--it's the opposite, usually 90-95% is remote use.”
  • In an ideal world, Sarah would “have an all-in-one integrated (and free/open source) system that managed all of our web presences in one place, so we don't have to go to 40 different sites to get the full picture.”  
  • Now we’re going to switch gears again and answer a few of your key questions from last week directly. I’ve identified four areas that take a bit of in-depth explanation to highlight here quickly, and have pointed those interested to further information and how-tos via the tinyURL at the bottom of each screen. Other questions you posed are going to be answered as we go with on-screen content, so if you don’t see your specific questions answered here and now, sit tight and try to extrapolate from the slides and conversation to come. As I’ve said a couple of times, one reason analytics can be necessary is to challenge the assumptions you have about user behavior and to understand known issues that affect the statistics coming into a main analytics account. In this respect, GA profiles and filters are hugely helpful. One of the questions we received last time was about bounce rates and tracking computer use in-library, which is a common point of interest among those with many dedicated patron computers: how much does the in-library home page default skew our overall numbers? There are two possible solutions to understanding this issue that I can think of. You can create two separate profiles, each with a filter: one that tracks bounce rate and page depth in your library with IP filtering, and another that excludes library computers from overall tracking to avoid data skew. This way, you will understand the level of traffic as well as the in-library user’s response to the library’s page as a home destination.
  • Good question. There is a method of achieving dynamic IP filtering by using JavaScript to set a cookie on each local machine to include or exclude local IPs, which I gather is a relatively laborious process but works well once you have established it. This filters local traffic similarly to static IP tracking.
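As a rough sketch of that cookie-based approach in ga.js-era code (the property ID and segment label here are placeholders, and the `_gaq` queue is stubbed so the sketch runs standalone; on a real page, ga.js consumes the queue):

```javascript
// Sketch of the cookie-based alternative to static IP filtering.
// A page you load once on each in-library machine sets a persistent
// "user defined" value via _setVar (stored in the __utmv cookie); a GA
// profile filter on that value then includes or excludes the machine
// regardless of its current dynamic IP.
// "UA-XXXXX-X" and "library-internal" are placeholder values.
var _gaq = _gaq || [];            // stubbed command queue
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_setVar', 'library-internal']);
_gaq.push(['_trackPageview']);
```

In the GA admin, a custom filter on the "User Defined" field matching `library-internal` then completes the setup, mirroring what a static IP filter would do.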
  • Google recently updated the tracking code snippet to enable this far more easily, a move that this particular presenter is ashamed to say she missed, and which has implications for the location of the code snippet later in the session: placement is now best achieved directly before the closing </head> tag. (One drawback of using page tagging, obviously, is that it keeps you on your toes. This change will require our tracking codes to be updated, which could have meant a change on every tracked page. We use a “server-side include” that enables us to change the code snippet in one location, and this will update the code--if not the placement--on every page that is tracked.) Sorry for that aside; back to the question at hand. The new “asynchronous” snippet facilitates tracking code customizations that are all laid out in the Google Analytics help area, which the tinyURL points to in this case. This way, you can view and produce reports that track and connect an OPAC and a library website.
  • In .swf e-learning objects created using programs like Captivate and Camtasia, this is not very elegantly done, owing to the barriers thrown up by Flash itself, but it is nevertheless achievable with some caveats. You can embed GA tracking code within the .html file associated with every .swf, which will track each action screen within the tutorial as a separate, non-linked page. This gives you limited data, but taken in aggregate it will help you understand how deeply users are interacting with each step of the tutorial. I have not done this myself and am not sure how the asynchronous snippet (which loads faster on GA pages, by the way) affects it. Watch the tutorial by Paul Betty of Regis University that is linked from the Distant Librarian article pointed to by the tinyURL.
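Since the speaker hasn't done this herself, here is only a hypothetical sketch of the virtual-pageview idea behind it: the wrapper .html page defines a function the .swf could call at each screen (for example via Flash's ExternalInterface), logging an invented virtual path per step. The function name, paths, and property ID are all illustrative, not from the source.

```javascript
// Hypothetical sketch: track each screen of a Flash tutorial as its own
// "page" by pushing virtual pageviews. All names here are invented.
var _gaq = _gaq || [];                     // stubbed command queue
_gaq.push(['_setAccount', 'UA-XXXXX-X']);  // placeholder property ID

function trackTutorialStep(step) {
  // Each step shows up in GA content reports as a distinct, non-linked
  // page, so aggregate counts reveal how far users get.
  _gaq.push(['_trackPageview', '/tutorials/catalog-basics/step-' + step]);
}

trackTutorialStep(1);
trackTutorialStep(2);
```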
  • Let’s take a quick breather here and see if we’re on target. Using the live chat box on your screen, please let us know what caught your eye and ear during the last session.
  • To recap some of the benefits of Google analytics and other page tagging programs: First of all, they (and GA specifically here, the dominant program) give you a level of detail into your users that simply isn’t possible using page logging, such as their precise locations (which could be very useful when tracking distance learners or patrons or deciding where to open a new branch). This may seem big brothery, but it’s all in the use and interpretation. Be a responsible steward of GA information and your users will be well served.
  • Another aspect of Google Analytics that is above and beyond is its depth of discovery information. It has an excellent capability of allowing you to see how people navigate to and discover your site through keywords, as well as the entrance points they are coming from. In this case, check out the different bounce rates of specific keywords. Those searching for the UC Berkeley bookstore have a tremendously high bounce rate compared to those seeking “library Berkeley”. Those seeking OskiCat, our local catalog, have a high bounce rate because we’re not tracking our catalog yet: this could either mean they are jumping out to search again, or that they identified the OskiCat link on the home page and clicked through. Also, this shows us that there is a great deal of name recognition of our newly redesigned catalog, which has only been around for about a year.
  • This is a screen showing you what I’ve been singing throughout this webinar: setting up profiles under each account that let you look at specific areas of the site, and specific things within those areas. The limit for profiles is 50, which I’ve maxed out on the GA account I manage. Here you can see that there are profiles set up for subject guides, library guides, instruction pages, and instruction pages with staff IPs excluded.
  • Another feature of Google Analytics is its ability to help you visualize the charts and graphs that can be relatively overwhelming, which is the flip side of using visualizations to help others interpret and understand the data you are producing. Google analytics has excellent internal visualization tools, as well. This is an image of the new “page analytics” feature, which is still in beta – it’s essentially the same type of tracking provided by the old feature called “site overlay” but much improved and with additional capabilities, which Char will demo later in the session (time permitting).
  • I’ve been playing around with it and it can be a little buggy on some filtered profiles, but I expect that it will continue improving into the future. You can access the new page analytics feature from the left-hand Content menu when you are viewing a report from within the Google Analytics dashboard.
  • One of Google Analytics’ key features is its ability to download and email reports at the push of a button.
  • You can also set up automatic reporting, and email reporting using the procedure pictured in this screenshot.
  • Another case study worth reviewing comes from Wei Fang, Digital Services Librarian at Rutgers-Newark Law Library for the Center of Law and Justice (“Using Google Analytics for Improving Library Website Content and Design: A Case Study,” available at http://www.webpages.uidaho.edu/~mbolin/fang.htm). Here’s the set-up: the library is part of Rutgers School of Law-Newark, had more than half a million volumes at the time the article was written, and has a primary mission of serving “the educational and research needs of the faculty and students of the Rutgers University School of Law.” Using Google Analytics, staff wanted to see what could be learned from keyword comparison, visualized summaries, trend reporting, defined funnel navigation (a way of determining whether users followed the paths staff designed for them to facilitate their searches), visitor segmentation, and other tools.
  • The author reports that the visualized summaries feature was “what we liked the most.” It included “80 predefined visualized reports that explain complex statistical data in a simple and easy-to-understand manner.” They were able to see a summary of website activities for the current week so they knew “how many visitors had visited our website, how many pages they had viewed, how many of them were new or returning visitors, where they were coming from and which website or search engine had referred them to our website.”
  • Because navigation “is a major part of the user experience on the web,” staff used that Defined Funnel Navigation tool to see which paths were being used as expected and which were not. They used Content by Title to obtain “a list of the most popular items on our website.” And Visitor Segmentation allowed them to drill down into reports including information “such as country, region, and keyword, to generate a new report that presents visitors’ detailed information.”
  • Among their findings: “Though about 85% of visitors used high-speed internet connections…15% of visitors still used dial-up or other low-speed connections”—which, of course, helped them understand what users faced in trying to use the library’s site. “85% of visitors used Internet Explorer as their browser, and about 11% used Firefox”—which reminded them of the importance of seeing how their pages displayed in each of those browsers. They also learned more about the screen resolutions their users had, which menus on their main website were attracting attention, and how other features were or were not being used, so they could reallocate space and work on better designs to serve users’ needs and meet their preferences. The article is well worth reading if you want to see how another organization kept its costs down and its results up.
  • Given the two case studies we’ve covered so far, was analytics necessary for gaining the insight each organization sought? Could this have been accomplished another way? Share your thoughts in the chat area before we move on to our next section.
  • Exploring analytics jargon, part 1.
  • Exploring analytics jargon, part 2.
  • Jacqueline Lichtman, Web/Marketing Librarian at the Jefferson-Madison Regional Library, provided an interesting bounce rate comparison for us that gets to the heart of what insight this particular statistic provides. First, consider the bounce rate of their main site.
  • Bounce rate comparison: their OPAC bounce rate is much lower.
  • Now we’re getting into what many of you asked for last week: the specific demos of how to do specific tasks and navigate GA to get the most out of the interface. Logging in takes a Google account, which can be set up with a library email address.
  • Creating an account is a fast and easy process.
  • Adding the tracking code can be difficult at first, but once you have done it a few times you will get into a rhythm of locating, copying, and pasting the code. The tracking snippet itself is located in the “edit > check” section of each profile in the account overview.
  • Using the new asynchronous snippet, you copy and paste the code before the closing </head> tag in any HTML editor.
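For reference, the asynchronous snippet current at the time of this webinar looked like the following (UA-XXXXX-X is a placeholder for your own property ID; the browser-only loader is wrapped in a guard here so the sketch also runs outside a page):

```javascript
// The 2011-era "asynchronous" Google Analytics snippet, placed inside a
// <script> element just before the closing </head> tag.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder property ID
_gaq.push(['_trackPageview']);

// The loader fetches ga.js without blocking page rendering (guarded so
// this sketch can also run outside a browser):
if (typeof document !== 'undefined') {
  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
}
```

Because commands are queued in the `_gaq` array rather than calling ga.js directly, pages keep working even while the script is still downloading.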
  • There are alternative procedures for how and where to add the code in content management systems, but usually the answers are easily found in program documentation or user forums.
  • Let’s explore the GA dashboard, which is an interface that causes people a lot of confusion. Here, I’ve labeled the eight most frequently used parts of the interface, and I’ll go through them one by one.
  • Let’s take our last break here for your questions and your comments. Again, using the live chat box on your screen, please let us know what you’ve been doing with Google Analytics or any other web analytics tools you’ve used before we move on to our final work today: live demonstrations of some of the things we’ve been discussing. And if you have posted a question we haven’t answered, please bring it back to our attention now.
  • Returning to the example we used in the previous session--that of tracking the mobile user shift using analytics--Char will jump into an analytics account and show you where and how to view this type of information using the left-hand dashboard menu in the visitors section.
  • Remember the scenario outlined in the last session, wherein Google Analytics showed us how mobile devices are increasing in use in this particular library?
  • Here’s an example of how to gather even more useful information about the mobile shift: comparing two years of use via mobile devices. Let’s look at this example of a PDF report generated from the GA dashboard. Later, during the live demonstration, we’ll show you how we found this information and go more in-depth into understanding how to create and interpret Analytics reports. Notice that there was very little mobile activity prior to midway through 2009, then a steady rise after that. Notice also the change in device use – there is huge growth in iPad access.
  • One GA meta-account should be created to rule them all. Use profiles and filters or tracking code modifications rather than multiple accounts to track different parts of your site – splitting tracking across multiple accounts makes it harder for the data to interact and produce reliable results.
  • Before Char starts in with her demo, this is a reminder that the help articles, videos, glossary, and forums in Google Analytics are a must-use for those who need to learn specific actions, and/or individuals just getting started with the tracking and analysis process.
  • As we did last week, we want to leave you with a few resources in case you want to dive in further. The WebAnalyticsLand site, shown in the upper right-hand corner of this slide, is one we previously recommended, and it has plenty that is applicable to what we explored today. Brian Clifton’s fabulous book Advanced Web Metrics with Google Analytics—now available in a second edition and shown here in the lower left-hand corner—is full of the sort of hands-on information you requested and which we’ve led you through today. And, to give you an ongoing fresh source of information, Clifton’s “Measuring Success,” which is billed as the “official blog” for his book, should keep all of us going for months to come. We’ll leave these up for you while we answer any remaining questions you have.

Web_Analytics_Part2--Analyzing_and_Acting--1-27-2011: Presentation Transcript

  • At what point do you become frustrated by analytics?
  • or… “logfile analysis” “page tagging”
  • Q. “All library computers open to the library home page, which skews data. Will bounce rate help me understand who is staying/going?” http://tinyurl.com/googleanalytics-ip A. Yes, with another step or two. Create a new profile for library computers only using IP filtering to track bounces locally, and/or filter local IPs from the main GA tracking profile.
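To illustrate the IP-filtering half of that answer: GA filter fields accept regular expressions, so a whole range of library machines can be matched at once. The 192.168.1.x range below is an invented example; substitute your own public-access addresses.

```javascript
// Hypothetical pattern for a GA "Exclude (or include) traffic from IP
// addresses" custom filter, assuming the library's patron machines sit
// on an invented 192.168.1.0-255 range. GA filter fields take regular
// expressions, so one pattern covers the whole subnet.
var libraryIps = /^192\.168\.1\.\d{1,3}$/;

console.log(libraryIps.test('192.168.1.42'));   // true  - in-library machine
console.log(libraryIps.test('192.168.24.42'));  // false - outside the range
```

Used as an exclude filter on the main profile, this keeps patron-computer home-page loads from skewing overall bounce rates; used as an include filter on a second profile, it isolates in-library behavior.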
  • Q. “Char, will that filter work if your college uses dynamic IP addresses?” A. Not exactly – this is made possible by using JavaScript to set a cookie on your internal computers. http://tinyurl.com/googleanalytics-dynamicip
  • Q. “How can we connect metrics for the website and catalog which are actually separate domains, so that we don't lose the trail of users who move from one to another?” http://tinyurl.com/googleanalytics-crossdomain A. You can achieve this with tracking code customizations.
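A sketch of what those cross-domain customizations looked like in ga.js-era code (domain names and property ID are placeholders; the exact, authoritative steps are in the help article behind the tinyURL). Both sites load the same snippet with linking enabled, and links from one domain to the other pass the visitor's cookies along so the session survives the domain change:

```javascript
// Sketch of ga.js cross-domain tracking between a website and a catalog
// on separate domains. All names here are invented placeholders; the
// _gaq queue is stubbed so the sketch runs standalone.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_setDomainName', 'example-library.org']); // cookie domain
_gaq.push(['_setAllowLinker', true]);                 // accept linked sessions
_gaq.push(['_trackPageview']);

// An outbound link to the catalog would then carry the visitor across:
//   <a href="http://catalog.example.org/"
//      onclick="_gaq.push(['_link', this.href]); return false;">Catalog</a>
```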
  • Q. “I'm curious about more on tutorial tracking, as well. When tutorials are made with a content authoring software (i.e. Captivate) is it possible to track beyond the first page?” http://tinyurl.com/googleanalytics-captivate (Distant Librarian & Paul Betty) A. Yes. You can embed a JavaScript call in a .swf’s .html file.
  • BOUNCE RATE REFERRAL SESSION UNIQUE VISITOR PROFILE VISITOR
  • GOAL PAGE DEPTH CONVERSION NAVIGATION REFERRER FILTER
  • BOUNCE RATE COMPARISON: Jefferson-Madison Regional Library
  • BOUNCE RATE COMPARISON: Jefferson-Madison Regional Library
  • [Screenshots: the GA dashboard with numbered callouts 1–8 marking its most frequently used areas, followed by close-up slides of each callout]
  • http://webanalyticsland.com/web-analytics-books/ http://www.advanced-web-metrics.com/blog/
  • Char Booth [email_address] blog: info-mational - infomational.com @charbooth Paul Signorelli & Associates San Francisco, CA 415.681.5224 [email_address] paulsignorelli.com
  • Image credits (taken from flickr.com unless otherwise noted):
    Empty Shelves: from Libraryonthemove’s photostream at http://www.flickr.com/photos/libraryonthemove/431950822/sizes/m/in/photostream/
    Library Users on Computers: from Pobrecito33’s photostream at http://www.flickr.com/photos/38117207@N03/4349654044/sizes/m/in/photostream/
    Minneapolis Public Library Interior: from Wikimedia Commons at http://commons.wikimedia.org/wiki/File:Minneapolis_Public_Library_interior.jpg
    Laptop: from Yum9me’s photostream at http://www.flickr.com/photos/yum9me/3135039820/sizes/m/in/photostream/
    Globe: from Cheesy42’s photostream at http://www.flickr.com/photos/cheesy42/4431725778/sizes/o/in/photostream/
    Rutgers Law Library - Newark: from the Rutgers University Libraries website at http://www.libraries.rutgers.edu/rul/libs/law_newark_lib/law_newark.shtml
    Colored Folders: from Asparina’s photostream at http://www.flickr.com/photos/honey_to_the_bee/443363039/sizes/m/in/photostream/
    Stack of Reports: from Kevin H.’s photostream at http://www.flickr.com/photos/kevharb/3056726319/sizes/m/in/photostream/
    Navigation Tool: from Marfis75’s photostream at http://www.flickr.com/photos/marfis75/5374308475/sizes/m/in/photostream/
    Magnifying Glass: from Arnybo’s photostream at http://www.flickr.com/photos/arnybo/2679622216/sizes/m/in/photostream/
    Target: from Ntang’s photostream at http://www.flickr.com/photos/ntang/21736757
    Rusty Grain Silos: photographed by Char Booth in west Texas.