Recommendations for usage tracking for research networking systems, v.1. July 2013


List of recommendations for usage tracking and analysis for institutions with research networking systems such as VIVO, Profiles or SciVal Experts. Version 1 of these recommendations was authored by a subgroup of the CTSA Research Networking Affinity Group (Wash U, Elsevier, UCSF).


Recommendations for Research Networking Systems (RNS) Usage Tracking – Version 1, July 2013

Foreword

A subgroup of the CTSA Research Networking Affinity Group (RNAG) has developed a set of recommendations for SGC3, focused on measuring the usage of research networking systems (RNS).

Introduction

Biomedical institutions have widely deployed institutional expertise discovery and research networking systems (RNS), including VIVO, Profiles RNS, and SciVal Experts. Institutions have invested in networking systems drawn by the promise of discovering complementary expertise, enabling collaboration, facilitating funding opportunity matching, and meeting other key needs for accelerating translational research.

Evidence is currently limited on the extent of RNS usage and the extent to which these tools promote research-oriented networking. However, it is possible to measure usage of these systems with web traffic analytics tools, via analysis of web logs. The best known of these tools is Google Analytics, but a number of attractive alternatives are emerging.1 In addition, qualitative methods can be employed to collect feedback on value and utility, especially from the user perspective.

The recommendations here reflect current measures being used by eight CTSA consortium member institutions and one commercial entity:

  • Harvard University
  • Medical University of South Carolina
  • Northwestern University
  • University of Iowa
  • University of Minnesota
  • University of California, San Francisco*
  • Vanderbilt University
  • Washington University*
  • Elsevier*

This set of recommendations is a start. This document is meant to be a living and changing set of recommendations that will inform institutions and SGC3 of the value and power of research networking systems for accelerating translational research. For any specific institution, adopting a subset of these measures can help to shape strategy, decision making, and practical tactics.
1 Some notable examples include Google Analytics, Site AI, and Woopra, with New Relic being used for performance monitoring. See Appendix 1 for more on tools. *Authored Version 1 of this document.
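The introduction notes that usage can be measured via analysis of web logs. As an illustration only (not part of the recommendations), here is a minimal Python sketch that counts distinct daily visitors from a server log in the common combined format; treating "distinct client IPs per day" as unique visitors is a simplifying assumption:

```python
import re
from collections import defaultdict

# Matches the start of a combined-format web log line: client IP, two
# ignored fields, then "[dd/Mon/yyyy:" opening the timestamp.
LOG_LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<day>[^:]+):')

def daily_visitors(log_path):
    """Count distinct client IPs per day, a rough proxy for unique visitors."""
    visitors = defaultdict(set)
    with open(log_path) as f:
        for line in f:
            match = LOG_LINE.match(line)
            if match:
                visitors[match.group('day')].add(match.group('ip'))
    return {day: len(ips) for day, ips in visitors.items()}
```

A production analysis would also exclude search-engine crawlers and your own team's IP addresses, as recommended in R3 below.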
The structure of this document is as follows: the set of recommended RNS usage measures below includes a description of each measure, along with a rationale explaining why the measure should be tracked. Institutional examples and case studies then describe successful strategies and tactics that have evolved from analysis of the data, ranging from informing the overall marketing strategy for adoption, to targeting specific faculty meetings for presentations and training, to gaining stakeholder and financial support. For ease of discussion and presentation, Google Analytics is used throughout as the example analytics tool.

Recommended RNS Usage Measures

For all measures: Track trends

R1 Establish a baseline and track over time

Track the same measures over time to illustrate patterns and responsiveness to actions (e.g., a new marketing campaign).

Set expectations

R2 Set expectations for trends over time

For trends, set expectations for how the measures will move. At UCSF, while they hope to see the number of 2+ minute on-campus visits per day increase over time, they expect this measure to plateau if the majority of the core user community comes to make appropriate use of the tool whenever needed.

Separate internal from external usage

R3 Distinguish between traffic internal and external to the institution

It is easy to distinguish between institutional and extra-institutional traffic with Google Analytics by looking at the visitor's service provider. By creating advanced segments that show only traffic from your institution's service provider(s) vs. traffic from all other service providers, you can filter every statistic through an internal vs. external lens. (Be sure that your team's own accesses to the site are ignored by filtering out your IP addresses with an account-level filter.)

UCSF set up two advanced segments: users with service provider "university of california san francisco," and those with any other service provider. They then look at every statistic via both segments. This shows, for example, that they get over ten times as many off-campus visits as on-campus visits, and that on-campus visitors are more likely to arrive via campus websites than external visitors.

The basics: Implement Google Analytics

R4 Track basic web traffic statistics via Google Analytics

Google Analytics affords easy access to basic usage analytics so you can determine:

  1. Visits to your site
  2. Page views
  3. Unique visitors
  4. % new vs. % returning visitors
  5. Bounce rate on home and profile pages
  6. Number of pages viewed per visit
  7. Average visit duration

Understand search traffic

R5 Use search traffic statistics to inform marketing and outreach opportunities

Google Analytics shows not only the amount of traffic from public search engines like Google and Bing, but also the keywords that brought users to your site. This can be used to understand search behavior and intent. For example, if search traffic is low, it might be an opportunity to do basic search engine optimization; or if users are coming to the site after searching for people's names on Google, it may provide an opportunity to highlight their networks on the landing page.

UCSF received only 2,500 visits a month from external search engines during launch month. They subsequently did search engine optimization work, and now receive over 40,000 visits a month from external search engines, 72% of all visits.
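For institutions working from raw web logs rather than Google Analytics, the internal/external split recommended in R3 can be approximated by checking client IPs against the campus network blocks. A minimal Python sketch; the network ranges below are documentation placeholders (RFC 5737), not any institution's actual blocks:

```python
import ipaddress

# Hypothetical campus network blocks; substitute your institution's
# actual published IP ranges.
CAMPUS_NETS = [
    ipaddress.ip_network('192.0.2.0/24'),     # placeholder range (RFC 5737)
    ipaddress.ip_network('198.51.100.0/24'),  # placeholder range (RFC 5737)
]

def classify_visit(client_ip):
    """Return 'internal' for campus IPs, 'external' for everything else."""
    addr = ipaddress.ip_address(client_ip)
    return 'internal' if any(addr in net for net in CAMPUS_NETS) else 'external'
```

Tagging every log line this way lets you compute each statistic through the same internal vs. external lens that the UCSF advanced segments provide.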
Delve into referral traffic

R6 Use referral traffic statistics to inform marketing and outreach opportunities

Google Analytics also provides data on which websites refer the most traffic to your system. By understanding these patterns across institutional and external sites, you can inform your outreach and marketing efforts. For example, if a web resource popular in the local research community does not refer much traffic, it could be an opportunity for outreach to get links back to your system added at the appropriate spots. Comparisons across institutions can also help here.

For example, the 2nd highest referring site at UCSF is the campus directory. UCSF partnered with the campus group responsible for the main online people directory to link each person's directory page back to the corresponding UCSF Profiles page; the result was a substantial increase in traffic to UCSF Profiles -- about 1,700 visits a month.

Understand what's popular

R7 Understand traffic patterns to different parts of the site to understand preferential usage, and prioritize outreach and development

Google Analytics allows users to see which parts of the site get the most traffic, and how users behave on each page, e.g. by looking at bounce rate. Understanding these measures is critical to understanding usage of the site.

A month after launch, UCSF staff were surprised to discover that only 9% of visits began on the home page, with most users arriving directly on profile pages. This provided the impetus to spend substantially more time improving data on profile pages, and to deemphasize tweaks to the home page.

Look at your geographic breakdown

R8 Examine regional and global traffic patterns to inform potential economic development impact

Google Analytics's audience location demographics will show your traffic from within your region, state, nation, and the world. If you're like many institutions, you might be surprised at how much of the traffic to your RNS comes from outside of your institution, let alone your state. In addition, look both at city (for the institution's city and the surrounding suburbs) and at network provider, which is often linked to the university, sometimes even down to the school or college within the university.
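The referral analysis in R6 can likewise be done outside Google Analytics by tallying the Referer header from web logs. A minimal Python sketch (the domains used in the test are hypothetical examples):

```python
from collections import Counter
from urllib.parse import urlparse

def top_referrers(referer_urls, n=10):
    """Tally referring domains from a sequence of Referer header values."""
    counts = Counter()
    for ref in referer_urls:
        host = urlparse(ref).netloc
        if host:  # skip empty or malformed referrers
            counts[host] += 1
    return counts.most_common(n)
```

A campus-directory domain appearing near the top of the tally, for instance, would point to the kind of directory partnership UCSF describes above.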
Benchmarking: Define meaningful engagement

R9A & R9B Track the number of "X+" minute visits to the RNS from within the institution (where "X" represents the length of time you deem meaningful); track the number of visits where users from within the institution view at least "Y" pages (where "Y" represents the number of pages you deem meaningful)

For context, we are using the model of an RNS that does not require log-in to search the network. As such, discuss and define a surrogate for what constitutes meaningful engagement with your RNS. Once defined, these measures can be found via Google Analytics; more complex queries can be built using advanced segments and/or custom dashboards.

At UCSF, they pay particular attention to the average number of visits from within the UCSF campus network lasting 2 minutes or longer. (Google Analytics defines a visit as a period of continuous usage where the gap between subsequent page views or interactions is under half an hour.) Because users do not need to log in to search UCSF Profiles, they use these 2+ minute on-campus visits as a proxy for deep, active use of the product by researchers, administrators, and other staff on campus.

Benchmark the traffic

R10 Benchmark relative to other campus web properties, both on and off campus

One way to infer the relative usefulness of your RNS is to track and benchmark the RNS traffic relative to other campus properties such as the directory or the main institutional web site. At UCSF, the RNS gets as much as 60% of the visits of the campus directory from all visitors, and 12% of the visits of the campus directory from UCSF campus visitors. This signals the relative importance of the system as a resource.
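The "X+ minute visit" measure of R9A can also be reproduced from raw logs by grouping hits into sessions with the same half-hour gap rule Google Analytics uses. A minimal Python sketch; how you derive a visitor ID (e.g. IP address plus user agent) is an assumption left to the implementer:

```python
from datetime import timedelta

SESSION_GAP = timedelta(minutes=30)  # Google Analytics' continuous-usage rule
MEANINGFUL = timedelta(minutes=2)    # the "X" in R9A; adjust as you see fit

def meaningful_visits(hits):
    """Count sessions lasting MEANINGFUL or longer.

    hits: iterable of (visitor_id, timestamp) pairs parsed from web logs.
    A session ends when the gap to the next hit reaches SESSION_GAP.
    """
    by_visitor = {}
    for vid, ts in sorted(hits):
        by_visitor.setdefault(vid, []).append(ts)
    count = 0
    for times in by_visitor.values():
        start = prev = times[0]
        for ts in times[1:]:
            if ts - prev >= SESSION_GAP:       # gap closes the session
                count += (prev - start >= MEANINGFUL)
                start = ts
            prev = ts
        count += (prev - start >= MEANINGFUL)  # close the final session
    return count
```

Running this over only the hits classified as internal (per R3) yields the "2+ minute on-campus visits" figure UCSF tracks.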
Segment populations

R11 Segment and compare populations of profiled users, and of visitors (when possible)

Using the data in the RNS, segment populations of engaged users to determine differing or similar patterns of use. The differences can inform outreach and training opportunities, and value can be inferred. Examples of segments: faculty rank, department, school, etc. Segments with lower adoption could be candidates for outreach and training, whereas segments with higher adoption are candidates for interviews on why and what is going well. If the system is behind a login wall, there may also be opportunities to segment end users.

By looking at UCSF Profiles adoption rates (where adoption is defined as a profile that has had some curation or customization completed, such as an uploaded photo), they found that faculty at all levels – junior to senior – have adopted their profiles in close to similar proportions. This seems to indicate equal importance of the tool across the segments, and helps as they plan outreach and marketing for the system.

Analyze where people click

R12 See what your users are clicking on and use the data to improve your product

A heatmap is an easy way to understand what users do on your site: a visual representation showing where users click and what they do. This can help you determine which aspects of your RNS are most popular and which are overlooked. Google Analytics provides this data via its In-Page Analytics feature, or a separate heatmap tool may be used. If certain classes of links on your site are being systematically ignored, this data can help inform user interface changes and/or marketing messages.

UCSF installed an inexpensive heat mapping tool and used it along with Google Analytics In-Page Analytics to prioritize improvements to their RNS home page.
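The adoption-rate comparison described under R11 amounts to a simple tally over profile records. A minimal Python sketch; the field names 'rank' and 'customized' are hypothetical, not a real RNS schema:

```python
from collections import defaultdict

def adoption_by_segment(profiles):
    """Compute adoption rate per segment.

    profiles: iterable of dicts with a 'rank' segment key and a boolean
    'customized' flag (e.g. photo uploaded), following the UCSF definition
    of adoption. Both field names are illustrative placeholders.
    """
    tallies = defaultdict(lambda: [0, 0])  # segment -> [adopted, total]
    for profile in profiles:
        tallies[profile['rank']][1] += 1
        tallies[profile['rank']][0] += profile['customized']
    return {seg: adopted / total for seg, (adopted, total) in tallies.items()}
```

Comparing the resulting rates across ranks (or departments, schools, etc.) surfaces the low-adoption segments that are candidates for outreach and training.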
Qualitative measures:

R13 Measure traffic impact resulting from various communications (e.g., PR and marketing campaigns) and then judiciously use the successful strategies in the future

Analyze traffic to your RNS before and after various marketing campaigns such as emails or trainings. Try different messages using A/B testing and measure which are most successful. Partner with other groups on campus so all of the messages don't come from one source. Measure the traffic impact from these various sources.

UCSF noticed a big spike in traffic after the Executive Vice Chancellor & Provost (EVCP) sent out an email announcing UCSF Profiles. This traffic spike was larger than that from any other email campaign and speaks to the reach and value of communications from UCSF's EVCP. Also, at UCSF, groups across campus from the Library to University Relations have helped spread the word about UCSF Profiles. These groups are on board to help promote UCSF Profiles because data mined from UCSF Profiles have facilitated those groups' projects.

Survey users on value & usage

R14 Survey users for direct feedback; use the feedback to inform strategy and gain support for your system

To really find out how people use the system and what value they garner from your RNS, you will need to ask them. The feedback will identify gaps and improvements, inform planning, and hopefully help gain support to sustain and enhance your system.

A number of SciVal Experts sites track numbers of inquiries and/or profile corrections as additional evidence of how many people are really looking at the site and its content. REACH-NC (the North Carolina statewide SciVal Experts portal) created a contact form that users fill out when trying to contact a researcher whose direct contact information is not displayed. The contact form goes to the campus liaison, who coordinates the contact. In addition, a copy of the request goes to the REACH-NC Executive Director and Program Coordinator so they can track requests and monitor connections. Inquiries and feedback from profiled researchers also go to a group mailbox that the ELS Account Development Manager personally manages, instead of going directly to the general ELS helpdesk.
Arizona State University had a 'Submit Feedback' button on their Experts site that linked to an end-user survey. (They felt that the information they were receiving was biased toward the negative and also unhelpful, so they discontinued use of the survey.) Note: this offers a lesson on being specific about what kind of feedback you solicit, not necessarily a warning against soliciting feedback in this way.

UCSF has implemented a pop-up survey on UCSF Profiles that, while annoying to many, has resulted in the collection of substantive feedback such as:

"Great resource for finding potential research collaborators and for PhD dissertation committees"
"I am hoping that it just helped me find a mentor..."
"It has helped find new nursing research problems"

Interventional networking:

Given the significant lag time between identifying a potential collaborator and the first outcome of a successful collaboration, such as a published paper or a funded grant award, it is difficult to directly link RNS usage to interventional networking. Additional studies need to be done to link short-term RNS usage patterns to longer-term outcome measures such as:

  • Number of new introductions/meetings/active collaborations
  • Interdisciplinary nature of new introductions/meetings/collaborations
  • Resultant joint grant applications and publications

Enable & track collaboration right from your RNS
R15 Enable collaboration from within your RNS

Identify potential collaborations and enable communication between researchers directly from your RNS.

The Michigan Corporate Relations Network (MCRN) portal functions like a multi-university Experts site that allows searching across participating universities in Michigan (the University of Michigan, the University of Michigan-Dearborn, Michigan State University, and Wayne State University at launch, with Western Michigan University shortly thereafter, and likely Michigan Tech sometime after that). Using an e-commerce metaphor, the site allows a businessperson interested in working with State of Michigan researchers to add researchers of interest to a list ('shopping cart'); the call-to-action ('checkout') process then routes the researchers of interest, along with a user-entered problem statement, to the appropriate university business relations offices. These contacts will be tracked centrally and by each office, hopefully with some evaluation of the quantity and quality of contacts (was it a legitimate research collaboration interest or a solicitor trying to sell something? did it turn into a collaboration?).

UCSF has an operational pilot that integrates UCSF Profiles with Salesforce Chatter (Chatter is UCSF's enterprise social networking tool). This allows one to "follow" a person directly from UCSF Profiles, and also allows a person to select a few people of interest and create a "Chatter Group" directly from UCSF Profiles.
Appendix 1: Notes / community comments on analytic tools

From various sources, including:

Google Analytics: Used as the default Profiles solution. Also used as the default VIVO solution (see the googleAnalytics.ftl FreeMarker template).

Woopra: An "all-in-one web application performance tool." "I've personally used Woopra before. It seems to be better than Google Analytics about tracking outbound links and for watching people click through your site in real time." (Paul Albert, WCMC)

New Relic: "For performance monitoring, we use New Relic. The best part about New Relic is that it will send a notice to other team members when you have tried to execute several malformed SPARQL queries in a row." (Paul Albert, WCMC)

Site Ai: "The problem with dashboards is they don't directly provide insights or deliver knowledge about the data. Even worse, most visualizations require the user to go through the mental exercise of interpreting the results. Site Ai does the analysis for you and presents the information in plain English." (Mark Fallu, Griffith University)

Mixpanel: "Page view counts are popular because they are easy to report, but ultimately cannot tell you how engaged your visitors are. Mixpanel lets you measure what customers do in your app by reporting actions, not page views." (Alex Viggio, University of Colorado Boulder)

Chartbeat: See demo.

CrazyEgg: "CrazyEgg provides data on what links users click on once they arrive on key pages or selected profiles. Click statistics are provided in several formats: heatmap, scrollmap, overlay, and 'confetti.' This allows us to improve the design of page elements strategically to encourage interaction." (Leslie Yuan, UCSF)

Qualaroo: "User feedback is collected via a popup survey tool; detailed reports are available online. We ask 'How has UCSF Profiles helped you?' to solicit success stories and better understand use cases, but the survey functions as a customer service access point as well. Users may enter an email address if they'd like a response." (Leslie Yuan, UCSF)