DLF 2012

Working Session on Assessment for DLF Fall Forum 2012 - Sherri Berger and Lisa Schiff

Speaker notes
  • I am not an assessment librarian, per se, but I am an analyst for our services. In order to continue to inform the strategic direction of our websites, I learned that I really had to start working within an assessment framework. Both Lisa and I have realized that this is becoming more and more a continual, integrated part of our work.
  • Ask 2-3 participants why they are interested in this working session. What are one or two major issues you see in assessment? Challenges you've encountered, things you wish you could do, questions you have, etc. Then give a working definition of assessment and its purposes.
  • Please ask questions and provide remarks as we go. The first part is designed to help you think through your own projects, with an emphasis on asking questions up front that will help guide your assessment. The second part gets into the data, with examples of some very cool data collection techniques we've implemented for our collection, and some of the shortcomings we've discovered. Finally, I'd like to open it up for discussion, returning to the questions we raised at the beginning and to any new ideas the session has sparked: what issues we see with assessment, and ideas for building out best practices and support structures.
  • Assessment can best be thought of in phases.
  • So, what does the process look like? You might think of assessment as just the data gathering, but I'd argue it starts much earlier. Scoping is where you do the groundwork up front that will get you the best possible data and let you analyze it most efficiently. It can be tempting to get right into the data, but in my experience the scoping effort is well worth it. Scoping can happen at various levels: for example, in a three-part assessment you might be looking at major questions, and then sub-questions to explore with various methodologies. This was absolutely essential for me when I started to look through Google Analytics and had far more data than I could ever do anything with; I needed to define very clearly what I was interested in learning and how it would advance our objectives.
  • This is where the bulk of the work occurs: everything from setting up your data collection method to looking closely at the data and deciding what is missing. It is a very cyclical process. This is where the scoping really comes in handy, because it helps you separate the wheat from the chaff.
  • I'm not going to spend much time on the outcomes area. That isn't because it's not important, but because it is so specific to each situation that I'm not sure there's much I can impart that would be of broad interest. However, I am happy to talk about this if there is interest. Regarding communication, I've also brought examples of the reports I've produced for various assessments, and you're free to browse these. Synthesizing and presenting complicated data to an audience is definitely an art form.
  • External factors guide your decision about what you need to know, whether assessment is necessary, etc. They really frame your assessment and remind you why you're doing it.
  • This obviously differs by field. In publishing there are altmetrics; in special collections there is an effort to understand this better, e.g. the latest issue of RBM.
  • I would encourage you to make your questions as discrete as possible
  • Try to dig down until you get to questions that are answerable by way of data gathering; those are the questions your assessment will be concerned with. Here, for example, we go from "discovery" to "visited" content, because visits are the metric that stats can answer. You are going from a more abstract concept to something more discretely answerable.
  • Sometimes policies or resources may restrict our ability to gather what we need. You have to be a little flexible here.
  • I'm sure you are all aware of many different types of methodologies, and possibly have more experience with some of them than I do. The important thing to me is that the methodology follows the need. I can't tell you how many times I've been on a committee where someone wants to learn something and says "let's do a survey!" Another misconception, in the other direction, is that web statistics will tell you a lot about your users. In fact they will tell you a lot about their BEHAVIORS, but nothing about their motivations or needs.
  • Another interface to OAC digital content: 330,000 digital objects; images and texts (PDF, TEI); target audience of K-12 teachers; topical groupings with context; ~2,000-3,000 visits per day.
  • So, why an assessment? When you think about spending less funding on this, it gives Calisphere a bit of an identity crisis. We had lists and lists of cool things to do, but weren't sure which to tackle first, or whom those should serve. From the administrative side, there was a need to break free of the original structure the site provides; for example, there is all this cool stuff on Melanesia, but the homepage tells the story of California. Contributions have decreased in recent years, and we could guess why, but we weren't positive.
  • I mentioned the assessment drivers before. Things were shifting, and we had a lot of questions about our various constituencies.
  • And, as you can see, there were some more discrete questions here.
  • And here are the methodologies we determined. So, let's stop there for a few moments and take a few minutes to start thinking about the worksheet. Then I'm going to go into the first two in a little more depth, because I think there is a lot there from our experience that may be helpful. We'll focus more on the quantitative data collection side; this is where I think there is a lot to be considered and figured out.
  • Let’s do the worksheets here
  • "Under the hood" of the Calisphere assessment: sharing with you some of the unique processes we developed, especially with respect to statistics gathering, and some lessons learned. How many of you have GA or a similar stats system on your digital library? How many of you have really gone in there and used it?
  • We knew anecdotally who some of our users were, but we suspected there were other audiences; we needed data. How to get it? We have no user accounts and a fairly strict privacy policy. So: a one-question survey asking users "who are you?" It was placed on almost every page of OAC and Calisphere, "popped up" on entrance, and appeared only once for that user across both sites, whether they answered or not. It was multiple choice, plus a write-in for "other". Time period: one week, November 30 to December 7, 2011. One reason we built it ourselves was that we could put Google Analytics code on the survey, so we could associate the answers with broader statistical trends (a sketch of reporting a poll answer as a GA event appears after these notes). For example, do teachers or college students more frequently use search engines to arrive at Calisphere? What is the average time a faculty researcher will spend on the OAC? We had a good sample size. We want to do this again to make sure the results are replicable, and to see whether there are "unusual" patterns.
  • It had to work across two websites. It had to come up on the first page, and only on the first page. We wanted it to be VERY visible, but not terribly intrusive. It had to conform to our privacy policy, or be close enough that we could expand the policy.
  • An EVENT is a user behavior that you want Google Analytics to track across the site; for example, how many people click on a particular button that is located on several pages. It is helpful to get aggregate data about a user action, instead of data tied only to a particular page.
  • So these are those same OAC “other” responses exported from GA, and weighted according to frequency.
  • There are two factors here. One is the behind-the-scenes work you need to do to get the data outputs you want: configuring the data. The second is the process of extracting what you need from this mess. So there's really a conversation here between the development staff and the analyst (of course, if you're the same person then you're in good shape!).
  • There was a fair amount we had to do before jumping into the stats. We had to get the stats code on the site and have enough data to work with. We had to work through some weird technical issues about how this would actually work across our systems (probably atypical). I had to get up to speed on what was possible. This is where the discrete questions are GOLDEN: there is SO MUCH info in there, and having these particular areas of inquiry helped me focus. I would write out little problem statements and then try to engineer actually getting that information, making use of all of the little things you can engineer, obviously in close conversation with the technical team. Finally: SITE KNOWLEDGE. This is definitely from the project manager's perspective, but I learned the most about how our website is structured; you really have to have a firm understanding of it.
  • Here's a great example of something we implemented. As I mentioned before, we assigned events to the user poll data. This enabled us to see each of the categories discretely in GA, but it also gave us a hook with which to SEGMENT the users. Segments are ways of limiting the data to users that meet a particular set of requirements, e.g. "entered at the homepage". In this case we were able to segment by things like "OAC users who called themselves faculty": user data integrated into GA, user types matched to behaviors!
  • The MAGIC is bringing the two data sets together.
  • I actually made a worksheet for myself to try to figure this out. The fancier metrics will potentially require you to export and manipulate the data (a small sketch of one such combined metric appears after these notes).
  • Also, there were times I got stuck and would actually write out my question, then match that question with the elements I needed to combine in order to get what I needed. This specific example is probably a lot of gobbledygook to you, but hopefully the idea is clear: you can parse your questions into their corresponding analytic components (see the decomposition sketch after these notes).
  • Obviously there is a tremendous amount of info out there. The Google Analytics documentation is not the easiest, but it works. If you want a more methodical introduction to web analytics, how it fits into the bigger picture, and what you can do with it, this is the best overview resource I found. He also has a blog.
  • I want to share with you some issues to look out for when it comes to analyzing and interpreting your data, specifically with respect to usage stats and polling data. This will also set up our discussion about what we as a community can do to support and improve assessment for digital libraries.
  • A lot of people asked us: was this a typical week? How do you know there wasn't a major class assignment for college students? Now we've figured out a way to put the poll on the site indefinitely, so we get rolling data about this. We are also experimenting with Bayesian statistics to calculate the percentages (a minimal sketch appears after these notes).
  • So I just got finished evangelizing Google Analytics, but there are some issues to look out for with respect to usage statistics and analytics software (and this could apply to any stats-gathering software). First of all, they tell us NOTHING about user motivations. This is my man Avinash Kaushik; he's got a great book called Web Analytics: An Hour a Day, and a blog. This was really my mantra: figuring out where we could make cognitive leaps because we know certain things about our users' needs, and where we had to realize that we just don't know why they are behaving in a certain way. It's also good advice for external people who view your data: don't jump to conclusions, and don't confuse the "why" with the "what".
  • This chart is trying to show me the distribution of page views across our pages. The only thing it really does is visualize the long tail: that giant grey area is the "other" pages that are each viewed less than 0.2% of the time (a sketch of computing that tail share from exported data appears after these notes). They do seem to be improving GA all the time; for example, there is now a plugin for Google Spreadsheets that lets you do a lot of data munging and visualization without needing to export everything into Excel.
  • COUNTER compliance? PIRUS means repositories share their COUNTER-compliant data, pulling together usage stats for distributed versions of a publication. My thesis: usage stats are an amazing source of information for understanding user behaviors and making key decisions about the design and features of your digital library, but I'm not convinced that they are yet an effective and/or useful set of metrics for demonstrating value or impact. So we get 2,000-3,000 visitors a day. Is that… average? There is supposed to be a GA benchmarking newsletter; I can't figure out how to receive it, or maybe they really aren't putting it out anymore. Even if there WERE benchmarks, it's hard to know what our industry would be. In mission, goals, and users we are closest to "higher ed", but in site structure we are closer to e-commerce, e.g. a stock photo site. This is where I feel the library community could do some work to share our data and come up with some standards, or at least a vague idea of what success looks like; there are similar issues with altmetrics.
  • Use the earlier questions to start, and add some more now. Prompts if we need them: What metrics can digital libraries use to define impact? Are the arguments and data around impact different for different constituencies? What are the best ways to benchmark our data? Can we share it somehow? How do we balance gathering user data with respecting privacy? How can we support content contributors in demonstrating their impact? How do we balance assessment and data collection with agile development practices?
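The notes above mention that the poll was built in-house partly so that Google Analytics code could be attached to it, and slide 25 lists GA "event actions" on the answer choices. The following is a minimal sketch of how a poll choice might be reported as an event with the classic asynchronous ga.js API that was current in 2012; the category and action names are invented for illustration, and the actual OAC/Calisphere implementation is the code linked from slide 29.

```typescript
// Classic asynchronous Google Analytics (ga.js) exposes a global command
// queue on the page; declare it so this compiles alongside the GA snippet.
declare const _gaq: unknown[][];

// Hypothetical handler for a poll choice. Recording the answer as a GA event
// makes each user type countable site-wide and provides the hook later used
// to build custom segments such as "users who called themselves faculty".
function recordPollAnswer(answer: string): void {
  _gaq.push([
    '_trackEvent', // GA event-tracking command
    'UserPoll',    // event category (illustrative name)
    'answered',    // event action (illustrative name)
    answer,        // event label, e.g. "Faculty or academic researcher"
  ]);
}

// Example: called from the submit handler of the jQuery poll dialog.
recordPollAnswer('Faculty or academic researcher');
```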
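The worksheet note above points out that the "fancier" metrics on slide 35 may require exporting and manipulating the data outside GA. As a small, hypothetical illustration of that kind of post-export work (the row shape and field names are assumptions, not the actual Calisphere export), this sketch combines two simple measures, visits and pages viewed, into a per-segment average:

```typescript
// One row per visit in a hypothetical export: the poll segment the visit
// belongs to and how many pages were viewed during it.
interface VisitRow {
  segment: string;     // e.g. "K-12 teacher", "Faculty or academic researcher"
  pagesViewed: number;
}

// Average pages viewed per visit, broken down by poll segment; one example
// of a metric you assemble yourself once the data is out of GA.
function pagesPerVisitBySegment(rows: VisitRow[]): Map<string, number> {
  const visits = new Map<string, number>();
  const pages = new Map<string, number>();
  for (const row of rows) {
    visits.set(row.segment, (visits.get(row.segment) ?? 0) + 1);
    pages.set(row.segment, (pages.get(row.segment) ?? 0) + row.pagesViewed);
  }
  const averages = new Map<string, number>();
  for (const [segment, visitCount] of visits) {
    averages.set(segment, (pages.get(segment) ?? 0) / visitCount);
  }
  return averages;
}
```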
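Another note above describes parsing a question into its corresponding analytic components, and slide 36 gives the worked example. Here is that decomposition captured as a data structure; the structure itself is an invention for illustration (it is not a GA API), while the components come straight from slide 36.

```typescript
// A scoping question broken into the Google Analytics pieces needed to
// answer it.
interface AnalyticsQuestion {
  question: string;
  metric: string;      // what is being counted
  segment?: string;    // which users to include
  dimension?: string;  // how to break the count down
  filter?: string;     // any further restriction
}

// The worked example from slide 36.
const teacherPollReach: AnalyticsQuestion = {
  question:
    'How many K-12 teachers took the survey, on any given page of Calisphere?',
  metric: 'unique visitors',
  segment: 'k-12 teacher (built from the poll event)',
  dimension: 'landing page',
  filter: 'Calisphere domain only, not OAC',
};
```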
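The notes also mention experimenting with Bayesian statistics to estimate the user-type percentages. Below is a minimal sketch of one common approach, assuming a uniform Beta(1,1) prior and a normal approximation to the resulting Beta posterior; the presenters' actual method is not described in the deck. The example figures, 445 "college or graduate student" responses out of 1,283 on Calisphere, come from slides 23 and 40.

```typescript
// Posterior for a proportion under a uniform Beta(1,1) prior is
// Beta(successes + 1, failures + 1). The interval below uses a normal
// approximation to that Beta, which is reasonable at this sample size.
function proportionEstimate(successes: number, total: number) {
  const alpha = successes + 1;
  const beta = total - successes + 1;
  const mean = alpha / (alpha + beta);
  const variance = (alpha * beta) / ((alpha + beta) ** 2 * (alpha + beta + 1));
  const halfWidth = 1.96 * Math.sqrt(variance); // ~95% credible interval
  return { mean, low: mean - halfWidth, high: mean + halfWidth };
}

// Example: 445 of 1,283 Calisphere respondents chose "College or graduate
// student", giving a mean of about 0.347 and an interval of roughly 0.32-0.37.
const students = proportionEstimate(445, 1283);
console.log(students);
```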
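Finally, the note about the GA pageview chart describes a long tail in which a giant grey "other" slice lumps together pages that are each viewed less than 0.2% of the time. A small sketch of computing that tail share directly from exported page-level rows (the export shape here is an assumption):

```typescript
// One row per page in a hypothetical pageview export.
interface PageRow {
  path: string;
  pageviews: number;
}

// Share of all pageviews going to "long tail" pages, i.e. pages that each
// receive less than `thresholdShare` of the total (0.2% by default, to
// mirror the chart described in the notes).
function longTailShare(rows: PageRow[], thresholdShare = 0.002): number {
  const total = rows.reduce((sum, r) => sum + r.pageviews, 0);
  if (total === 0) return 0;
  const tail = rows
    .filter((r) => r.pageviews / total < thresholdShare)
    .reduce((sum, r) => sum + r.pageviews, 0);
  return tail / total; // e.g. 0.78 means 78% of views land on long-tail pages
}
```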
  • Transcript

    • 1. Assessment 360: Evaluating and planning for the future of a digital library project. DLF Fall Forum 2012: Denver, CO. Sherri Berger & Lisa Schiff
    • 2. Why Assessment? The process of collecting and analyzing data about and/or around the service in order to: • Demonstrate impact • Determine value • Understand needs • Uncover opportunities • Drive decision-making, prioritization • Engage community members (side-effect)
    • 3. Today's Schedule • Part 1: Planning Assessment – Overview and process – Scoping exercise • Part 2: Conducting Assessment – Effective collection of usage data – Issues in data analysis • Discussion
    • 4. Assessment Process
    • 5. Assessment Process • Scope: Define objectives, Develop questions, Determine method(s)
    • 6. Assessment Process • Scope: Define objectives, Develop questions, Determine method(s) • Conduct: Collect data, Analyze and interpret, Identify gaps
    • 7. Assessment Process • Scope: Define objectives, Develop questions, Determine method(s) • Conduct: Collect data, Analyze and interpret, Identify gaps • Synthesize: Derive recommendations, Communicate results
    • 8. Scoping Assessment
    • 9. Define Objective(s) • Purpose of the assessment: the "why" • What you need to know in order to: – make a particular decision – define a strategic direction – determine service value – demonstrate impact • Could be one or several, depending on scope of service, future vision
    • 10. A Note on Impact • Lack of standard, meaningful metrics • What are the performance indicators? • Hinges on value – has this been defined? • "Digital projects and programmes need to engage with the core principle of impact assessment: how does this change people's lives?" (Tanner 2012)
    • 11. Develop Questions • What, in particular, needs to be uncovered or understood to move towards the objective? • Discrete, answerable and/or quantifiable • May be several questions per objective • Could be hierarchical
    • 12. Develop Questions. OBJECTIVE: Ensure that users are finding content on our site that is relevant to their research needs. Can we improve the discovery of our digital content? If so, how? What content is least visited? What are the features of this content? Why aren't users arriving at this content? What content is most visited? What are the features of this content? How are users arriving at this content? What are SEO strategies for this type of content? What are strategies for surfacing content within the website?
    • 13. Determine Methods • What data is necessary to answer the question? • Do we need quantitative, qualitative, or both? • How much do we need? • What is the best way to collect it? • If we can't collect it, what are the alternatives?
    • 14. Determine Methods *not a weighted tag cloud!
    • 15. Case Study: Scoping the Calisphere Assessment
    • 16. Assessment Drivers • High effort to keep up with K-12, and possibly ancillary to constituents' primary interests • Lots of development ideas, but not sure where to focus efforts • Decreasing digital object contributions as of late
    • 17. Objectives • We need a better picture of users and usage • We want to know what we can do to grow content, and what hinders growth • We want to define a future vision for the Calisphere website: can we expand its role?
    • 18. Questions: Who is using the site? How are they using it? What are contributor needs?
    • 19. Questions: Who is using the site? How are they using it? (How engaged are they? What are their paths to the site? What content do they use?) What are contributor needs? (What are their motivations? What more can we do to meet their needs, and those of their users? How do they perceive our service?)
    • 20. Methods: Who is using the site? – user poll. How are they using it? (How engaged are they? What are their paths to the site? What content do they use?) – usage stats analysis. What are contributor needs? (What are their motivations? What more can we do to meet their needs, and those of their users? How do they perceive our service?) – interviews.
    • 21. Activity: Scoping Exercise http://bit.ly/dlf_worksheet
    • 22. Part 2: Conducting Assessment Data Collection
    • 23. One-Question User Poll. Total responses and response rate: OAC 2,162 (21%); Calisphere 1,283 (15%).
    • 24. Developing the Poll • Requirements, keeping in mind: – Technical feasibility – End-user experience – Data collection – Privacy issues • Comparative analysis – Third-party option or build it ourselves?
    • 25. Developing the Poll • Verdict: build it ourselves • Specs: – jQuery dialog pop-up – Inviting design and colors – Flash cookies to prevent repeat surveys on three domains – Google Analytics "event actions" on choices
    • 26. Collecting Data in GA: Distribution of answers
    • 27. Collecting Data in GA: OAC "Other" responses
    • 28. Visualizing the Data
    • 29. Developing the Poll: Steal Brian Tingle's code: http://code.google.com/p/json4lib/source/browse/#hg%2Fsurvey
    • 30. Usage Stats Analysis • Our questions: – How engaged are our users? – What are the paths to our websites? – What content is used? • The challenge: answer those questions using...
    • 31. Configuring Web Analytics • Weird set-up for us: one profile, three domains • Learning curve – How much data do I need to recognize trends? – What's there, what's not? – How do I manipulate the tool to answer my questions? – Is there more we need to set up to get what I need? • Implement events, goals, custom reports: collaborate with developers
    • 32. Configuring Web Analytics • Assigned events to user poll • Created custom segments based on the events
    • 33. User Data Magic! Calisphere traffic sources by segment: site-wide during survey, K-12 teachers, K-12 students, college/graduate students
    • 34. User Data Magic! OAC: content types visited by college/grad students, by number of pageviews: any finding aid, any digital object, custom search result set, any "view" of finding aid or text, any institution "homepage", OAC homepage, About page, institution map, custom "items" search results
    • 35. Identifying Metrics • Example question: How engaged are our users? • Simple metrics: – Time on site – Pages viewed – Frequency – Returning vs. non-returning • Fancier metrics: – Frequency over pages viewed
    • 36. Identifying Metrics • It can get complicated! "I want to know the number of K-12 teachers who took the survey on any given page of Calisphere" – How many = metric "unique visitors" – Specific user group = segment "k-12 teacher" – Which pages = dimension "landing page" – [Calisphere, not OAC = filter by domain]
    • 37. Recommended Reading • Avinash Kaushik, Web Analytics: An Hour a Day: http://www.worldcat.org/title/web-analytics-an-hour-a-day/oclc/442451627
    • 38. Issues in Data Analysis
    • 39. Confidence in Data Collected • Check your methodology – Are you getting the data you anticipated? – Did you encounter any problems? – Is there a need to re-do efforts? • Does the data set indicate a trend or an anomaly? – Can it be replicated?
    • 40. Confidence in Data Collected. Calisphere user segments identified by poll: K-12 teacher or librarian 133 (10%); K-12 student 280 (22%); college or graduate student 445 (35%); faculty or academic researcher 82 (7%); archivist or librarian 55 (4%); plus two further segments of 220 (17%) and 68 (5%).
    • 41. Limits to Usage Stats • Tell us nothing about user motivations: "The best that all this data will help you understand is what happened. It cannot, no matter how much you torture the data, tell you why something happened." – Avinash Kaushik
    • 42. Limits to Usage Stats • Massive, complicated websites = massive, complicated, and sometimes weird data (useless)
    • 43. Limits to Usage Stats • Benchmarks: what's "good"? • What's the right metric?
    • 44. Open Discussion
    • 45. References / Resources • Simon Tanner, Measuring the Impact of Digital Resources: The Balanced Impact Model: http://www.kdcs.kcl.ac.uk/innovation/impact.html • RBM Assessment Issue, Fall 2012; 13(2): http://rbm.acrl.org/content/13/2.toc • Avinash Kaushik, Web Analytics: An Hour a Day: http://www.worldcat.org/title/web-analytics-an-hour-a-day/oclc/442451627 • OAC/Calisphere pop-up poll code: http://code.google.com/p/json4lib/source/browse/#hg%2Fsurvey • OAC/Calisphere 2011-2012 assessment final report: http://www.cdlib.org/services/dsc/calisphere/docs/oac_calisphere_assessment_summary_report_2012.pdf • Scoping worksheet: http://bit.ly/dlf_worksheet
    • 46. Image Attribution • Slide 1: "Monrovia Library Orange Computers", Flickr user: Living in Monrovia, http://www.flickr.com/photos/livinginmonrovia/3600590163/