[Paul] Welcome, and thanks for joining us for the second of two ALA TechSource sessions designed to help demystify the topic of web analytics. We hope to make this session as interactive as the last, because a lot of valuable information was shared and connections made through audience chat. Please help us continue to benefit from the wisdom and experience you bring to the sessions by making comments and posing questions during the presentation. Today, we'll be looking beyond the basics: highlighting three case studies of successful analytics use and going more in-depth into using and interpreting Google Analytics in the library environment. One case study runs throughout – Char's experience using GA at Cal (UC Berkeley) – and the other two have their own short sections: San Jose Public Library and Rutgers University.
[Paul] Briefly remind participants that we have corporeal forms.
[Char] Because Paul took the helm for much of the last presentation, Char will take the lead for #2, again owing to our differing perspectives on the topic (mine is more data-oriented and techie, whereas Paul's is more public service/big picture oriented). We will both be active in the session and help each other field questions during chat periods. Following the webinar, please direct questions about session 1 to Paul and questions about session 2 to me.
[Char] Poll will run in WebEx while this slide is up. Answer choices: A. No frustration here B. Getting started with tracking C. Navigating the interface D. Data collection & analysis E. Implementation & action F. Other – tell us in chat!
[Char] I wanted to put up this slide again from the Waisberg and Kaushik article to remind us that the data produced by these programs assists you in making iterative design changes to web interfaces and tools, and gives you ongoing insight into changes in your web user population. Not only can you use analytics to define goals for a site or web-based service, identify users' "key performance indicators," collect and analyze data, and make adjustments or major design changes to a site; you can also use web analytics to understand the long-term traffic and user behavior that helps you balance the information received from other types of data input. This is one of the most important aspects of analytics tracking to me: monitoring changes in web use of library sites and services over the long haul.
[Char] Not to privilege page tagging too heavily, I wanted to include an image showing a basic visualization of server logging analytics. This slide was taken from UC Berkeley's own logfile analysis program, Wusage, which runs in tandem with our Google Analytics accounts – the two don't interfere with each other. The data produced here is basic, but it is also far easier to interpret and use than GA, which has a real learning curve: those who need simple insight into the number of users accessing different areas of the library's site can come here as a first stop, which then provides an entryway to GA if more detailed information is needed. The figures here and in GA never match exactly, due to a number of factors ranging from page caching (which skews data in logfile analytics) to cookies (which skew data in page tagging analytics).
[Paul] Now we’re going to switch gears and look at one of our two dedicated case studies. In her own library environment, Sarah Houghton-Jan, our colleague from San Jose Public Library and author of the Librarian in Black blog, starts with the basic idea of “using web analytics stats to review what's being used on the site and what isn't (including tracking marketing campaigns & their success/failure).” She wants to know “why something isn't being used if you think it should be, and figuring out a way to solve that problem through user interface design, navigation changes, naming,” and other means.
[Paul] For example, she and colleagues knew there was very little use of their library's research guides on their old site—"something like 200 different subject-based research guides that took about 20 hours a week to maintain." They tried a variety of ways to increase usage, but "the numbers had dropped even further. It showed us it just wasn't something people wanted...so we cut it back from 200 to four critical guides for the new website." It's too soon to tell whether the drastic reduction makes the remaining guides stand out more. Only the librarians in the central library had even noticed the reduction when we spoke, and there had not yet been an increase in use.
[Paul] When thinking about other elements of the library's web presence, Sarah suggested that "people are looking at social media numbers (Twitter, Facebook) but not at the numbers that matter. Followers/fans is part of it, but how many people do you lose? How many comments? How many re-tweets?... What about people checking in to your libraries on Foursquare, Gowalla, or Facebook CheckIn? What about Yelp reviews? I would like to, but am admittedly not, tracking all of this."
[Paul] "Views & visits of everything, pretty much, is where I would focus." "I tend to use simple bar and pie graphs, as well as scatter graphs, to show the visuals of our site trends over time, or compared to other resources...I do this for all of our databases and e-books, too. Also showing, for example, the visits to our catalog daily vs. the visits to our website proper daily vs. the visits to our King Library daily. It really shows people how 'digital' does count & does have a lot of activity…also pointing out the percentage of in-library use vs. out-of-library use--many librarians assume most library website/database use comes from inside their four walls--it's the opposite, usually 90-95% is remote use."
[Paul] In an ideal world, Sarah would "have an all-in-one integrated (and free/open source) system that managed all of our web presences in one place, so we don't have to go to 40 different sites to get the full picture."
[Char] Now we're going to switch gears again and answer a few of your key questions from last week directly. I've identified four areas that take a bit of in-depth explanation to highlight here quickly, and have pointed those interested to further information and how-tos via the tinyURL at the bottom of each screen. Other questions you posed are going to be answered as we go with on-screen content, so if you don't see your specific questions answered here and now, sit tight and try to extrapolate from the slides and conversation to come. As I've said a couple of times, one reason analytics can be necessary is to challenge the assumptions you have about user behavior and to understand known issues that affect the statistics coming into a given analytics account. In this respect, GA profiles and filters are hugely helpful. One of the questions we received last time from ** asked about bounce rate and tracking computer use in-library, which is a common point of interest among those with many dedicated patron computers: how much does the in-library home page default skew our overall numbers? There are two possible solutions to understanding this issue that I can think of. You can create two separate profiles, each with a filter: one that uses IP filtering to track bounce rate and page depth inside your library, and another that excludes library computers from overall tracking to avoid data skew. This way, you will understand the level of traffic as well as the in-library user's response to the library's page as a home destination.
[Char] Google recently updated the tracking code snippet to enable this far more easily – a change this particular presenter is ashamed to say she missed – and it has implications for the placement of the code snippet, discussed later in the session: the snippet is now best placed directly before the closing </head> tag. (One drawback of page tagging, obviously, is that it keeps you on your toes. This change requires our tracking code to be updated, which could have meant a change on every tracked page. We use a "server-side include" that lets us change the code snippet in one location, and this updates the code – if not its placement – on every page that is tracked.) Sorry for that aside; back to the question at hand. The new "asynchronous" snippet facilitates tracking code customizations that are all laid out in the Google Analytics help area, which the tinyURL points to in this case. This way, you can view and produce reports that track and connect an OPAC and a library website.
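[Note for the slides: for those who haven't seen it, the asynchronous snippet Char mentions looks roughly like this – a sketch, with "UA-XXXXXX-X" as a placeholder for your own web property ID – placed just before the closing </head> tag:]

```html
<script type="text/javascript">
  // Asynchronous Google Analytics snippet (ga.js era).
  // 'UA-XXXXXX-X' is a placeholder for your own account/property ID.
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js without blocking the rest of the page from rendering.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

Because the script loads asynchronously, it can sit in the head without delaying page rendering – which is why the recommended placement moved up from the bottom of the page.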
[Char] In .swf e-learning objects created using programs like Captivate and Camtasia, this is not very elegantly done, given the barriers thrown up by Flash itself, but it is nevertheless achievable with some caveats. You can embed GA tracking code within the .html file associated with each .swf, which will track each action screen within the tutorial as a separate, non-linked page. This gives you limited data, but taken in aggregate it will help you understand how deeply users are interacting with each step of the tutorial. I have not done this myself and am not sure how the async snippet (which, by the way, lets tracked pages load faster) affects it. Watch the tutorial by Paul Betty from Regis University that is linked from the Distant Librarian article pointed to by the tinyURL.
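[Note for the slides: as a hedged sketch of the wrapper-page approach – I haven't done this myself, and the "/tutorials/ill-basics/" virtual paths and the trackTutorialStep helper are hypothetical names for illustration – the .html file around the .swf can report each tutorial step as a virtual pageview:]

```html
<!-- Wrapper page for the .swf. The GA snippet records the initial load,
     and the Flash movie can call back into JavaScript (for example via
     ExternalInterface) as the user advances, so each step appears as
     its own "page" in GA reports. -->
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-X']);  // placeholder property ID
  _gaq.push(['_trackPageview', '/tutorials/ill-basics/start']);  // hypothetical virtual path

  // Hypothetical helper the tutorial calls on each screen change.
  function trackTutorialStep(stepName) {
    _gaq.push(['_trackPageview', '/tutorials/ill-basics/' + stepName]);
  }
</script>
```

In the Content reports, these virtual paths then show up alongside real pages, so you can see how far into the tutorial users typically get.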
[Paul] Let's take a quick breather here and see if we're on target. Using the live chat box on your screen, please let us know what caught your eye and ear during the last session.
[Char] Okay, so to recap some of the benefits of Google Analytics and other page tagging programs. First of all, they (and GA specifically here, the dominant program) give you a level of detail about your users that simply isn't possible with logfile analysis, such as their geographic locations (which could be very useful when tracking distance learners or patrons, or deciding where to open a new branch). This may seem Big Brother-ish, but it's all in the use and interpretation. Be a responsible steward of GA information and your users will be well served.
[Paul] Another aspect of Google Analytics that is above and beyond is its depth of discovery information. It has an excellent capability of allowing you to see how people navigate to and discover your site through keywords, as well as the entrance points they are coming from. [Char] In this case, check out the different bounce rates of specific keywords. Those searching for the UC Berkeley bookstore have a tremendously high bounce rate compared to those seeking "library Berkeley". Those seeking OskiCat, our local catalog, have a high bounce rate because we're not tracking our catalog yet: this could either mean they are jumping out to search again, or that they identified the OskiCat link on the home page and clicked through. Also, this shows us that there is a great deal of name recognition of our newly redesigned catalog, which has only been around for about a year.
[Char] This is a screen showing you what I've been singing the praises of throughout this webinar: setting up profiles under each account that let you look at specific areas of the site, and specific things within those areas. The limit for profiles is 50, which I've maxed out on the GA account I manage. Here you can see that there are profiles set up for subject guides, library guides, instruction pages, and instruction pages with staff IPs excluded.
[Paul] Another feature of Google Analytics is its ability to help you visualize data that can otherwise be relatively overwhelming – the flip side of using visualizations to help others interpret and understand the data you are producing. Google Analytics has excellent internal visualization tools as well. This is an image of the new "page analytics" feature, which is still in beta – it's essentially the same type of tracking provided by the old "site overlay" feature, but much improved and with additional capabilities, which Char will demo later in the session (time permitting). [Char] I've been playing around with it and it can be a little buggy on some filtered profiles, but I expect that it will continue improving into the future. You can access the new page analytics feature from the left-hand Content menu when you are viewing a report from within the Google Analytics dashboard.
[Char] One of Google Analytics’ key features is its ability to download and email reports at the push of a button.
[Char] You can also set up automated and emailed reporting using the procedure pictured in this screenshot.
[Paul] Another case study worth reviewing comes from Wei Fang, Digital Services Librarian at Rutgers-Newark Law Library for the Center of Law and Justice ("Using Google Analytics for Improving Library Website Content and Design: A Case Study," available at http://www.webpages.uidaho.edu/~mbolin/fang.htm). Here's the set-up: the library is part of Rutgers School of Law-Newark, had more than half a million volumes at the time the article was written, and has a primary mission of serving "the educational and research needs of the faculty and students of the Rutgers University School of Law." Using Google Analytics, staff wanted to see what could be learned from keyword comparison, visualized summaries, trend reporting, defined funnel navigation (a way of determining whether users followed the paths staff designed for them to facilitate their searches), visitor segmentation, and other tools.
[Paul] The author reports that the visualized summaries feature was “what we liked the most.” It included “80 predefined visualized reports that explain complex statistical data in a simple and easy-to-understand manner.” They were able to see a summary of website activities for the current week so they knew “how many visitors had visited our website, how many pages they had viewed, how many of them were new or returning visitors, where they were coming from and which website or search engine had referred them to our website.”
[Paul] Because navigation "is a major part of the user experience on the web," staff used the Defined Funnel Navigation tool to see which paths were being used as expected and which were not. They used Content by Title to obtain "a list of the most popular items on our website." And Visitor Segmentation allowed them to drill down into reports including information "such as country, region, and keyword, to generate a new report that presents visitors' detailed information."
[Paul] Among their findings: "Though about 85% of visitors used high-speed internet connections…15% of visitors still used dial-up or other low-speed connections"—which, of course, helped them understand what users faced in trying to use the library's site. "85% of visitors used Internet Explorer as their browser, and about 11% used Firefox"—which reminded them of the importance of seeing how their pages displayed in each of those browsers. They also learned more about the screen resolutions their users had, which menus on their main website were attracting attention, and how other features were or were not being used, so they could reallocate space and work on better designs to serve users' needs and meet their preferences. The article is well worth reading if you want to see how another organization kept its costs down and its results up.
[Char] Given the two case studies we've covered so far, was analytics necessary for gaining the insight each organization sought? Could this have been accomplished another way? Share your thoughts in the chat area before we move on to our next section.
How Libraries Analyze and Act Part II
At what point do you become frustrated by analytics?
Q. “All library computers open to the library home page, which skews data. Will bounce rate help me understand who is staying/going?” http://tinyurl.com/googleanalytics-ip A. Yes, with another step or two. Create a new profile for library computers only using IP filtering to track bounces locally, and/or filter local IPs from the main GA tracking profile.
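As a sketch of the filter itself (the 192.0.2.x range here is a documentation-only stand-in for your library's actual IP range): in each profile's settings, add a Custom Filter on the Visitor IP Address field – Exclude in the main profile, Include in the library-computers-only profile – with a regular expression pattern like:

```
^192\.0\.2\.
```

The same pattern serves both profiles; only the Include/Exclude direction changes.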
Q. “How can we connect metrics for the website and catalog which are actually separate domains, so that we don't lose the trail of users who move from one to another?” http://tinyurl.com/googleanalytics-crossdomain A. You can achieve this with tracking code customizations.
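A sketch of that customization under the ga.js asynchronous snippet, assuming a hypothetical catalog at catalog.example.org and a placeholder property ID – both domains carry the same snippet, and links between them are "decorated" so the visitor's session carries over:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-X']);   // same account on both sites
  _gaq.push(['_setDomainName', 'none']);       // treat the two domains independently...
  _gaq.push(['_setAllowLinker', true]);        // ...but accept linked visitor data
  _gaq.push(['_trackPageview']);
</script>

<!-- A link from the website to the catalog passes the visitor's GA
     cookie values along in the URL so the session isn't broken: -->
<a href="http://catalog.example.org/"
   onclick="_gaq.push(['_link', this.href]); return false;">Catalog</a>
```

With this in place, a single profile can report on traffic that moves between the website and the OPAC without counting the hand-off as a new visit.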