Are Altmetrics for you?
PlumX at Saint Mary’s
The role of metrics in institutional evaluation
• Bibliometrics
• Citation Counts
• "The value of information is determined by those who use it" (Eugene Garfield)
• Journal Impact Factors
• Rank & Tenure
Why use alternative metrics?
• Predict citation counts?
• Non-journal-based fields:
  • Medical researchers developing new protocols
  • Creative artists and writers
  • Book-intensive disciplines
  • Business management cases
  • Inventions and patents
Criticisms of altmetrics
• Not a measure of quality: hype, piling on
• Apples and oranges: measuring so many disparate categories is meaningless
• Manipulation: robo-tweeting
• Empirical evidence is slim
Why altmetrics at SMC?
• School of Business and AACSB accreditation
• Business Dean asked Business Librarian to compile data:
  • Book sales
  • Use of cases in classes
  • Media mentions
• Sedona implementation left out artists, musicians, novelists
• PlumX offered much of what we were seeking
What PlumX measures: mix of metrics
PlumX at SMC
PlumX at SMC: by School
PlumX at SMC: by School
PlumX at SMC: by School
PlumX at SMC: by Format (Books)
PlumX at SMC: by Artifact (Book)
PlumX at SMC: by Artifact (Article)
PlumX at SMC: by Researcher (early career)
PlumX at SMC: senior faculty
PlumX at SMC: Performing Artist
Conclusions and questions
• PlumX helps early career researchers and non-scientists demonstrate their impact better than traditional metrics
• Will tenure committees, accrediting bodies and granting agencies care?
• Will librarians have the time to compile profiles for all faculty, or will we be successful lobbying for additional staff?
• Will faculty and the Provost be impressed enough to continue to subscribe?
Thank you!
• Slides will be posted on SlideShare
• Questions or comments?
References
• Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903. doi:10.1016/j.joi.2014.09.005
• Cameron, B. D. (2005). Trends in the Usage of ISI Bibliometric Data: Uses, Abuses, and Implications. Portal: Libraries and the Academy, 5(1), 105–125. doi:10.1353/pla.2005.0003
• Galligan, F., & Dyas-Correia, S. (2013). Altmetrics: Rethinking the Way We Measure. Serials Review, 39(1), 56–61. doi:10.1016/j.serrev.2013.01.003
• Garfield, E. (2005). The Agony and the Ecstasy: The History and Meaning of the Journal Impact Factor. International Congress on Peer Review and Biomedical Publication, 1–22. doi:10.1001/jama.295.1.90
• Hecht, F., Hecht, B. K., & Sandberg, A. A. (1998). The journal "impact factor": A misnamed, misleading, misused measure. Cancer Genetics and Cytogenetics, 104(2), 77–81. doi:10.1016/S0165-4608(97)00459-7
• Hirsch, J. E. (2007). Does the h index have predictive power? Proceedings of the National Academy of Sciences of the United States of America, 104(49), 19193–19198. doi:10.1073/pnas.0707962104
• History of Citation Indexing. (n.d.). Retrieved January 19, 2015, from http://wokinfo.com/essays/history-of-citation-indexing/
• Information systems. (2004). In B. C. Vickery & A. Vickery (Eds.), Information Science in Theory and Practice (3rd ed., pp. 210–260). Munich: K. G. Saur. Retrieved from http://site.ebrary.com/lib/stmarysca/detail.action?docID=10256701
• Kousha, K., & Thelwall, M. (2009). Google Book Search: Citation analysis for social science and the humanities. Journal of the American Society for Information Science and Technology, 60(8), 1537–1549. doi:10.1002/asi.21085
• McNutt, M. (2014). The measure of research merit. Science, 346(6214), 1155. doi:10.1126/science.aaa3796
• Nederhof, A. J. (2006). Bibliometric monitoring of research performance in the Social Sciences and the Humanities. Scientometrics, 66(1), 81–100. doi:10.1007/s11192-006-0007-2
• Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do Altmetrics Work? Twitter and Ten Other Social Web Services. PLoS ONE, 8(5), 1–7. doi:10.1371/journal.pone.0064841
• Vaughan, L., & Shaw, D. (2003). Bibliographic and Web Citations: What Is the Difference? Journal of the American Society for Information Science and Technology, 54(14), 1313–1322. doi:10.1002/asi.10338
• Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of "alternative metrics" in scientific publications. Scientometrics, 101(2), 1491–1513. doi:10.1007/s11192-014-1264-0

AAAS 2015: PlumX altmetrics @ SMC


Editor's Notes

  • #2 Hi Everyone, let me introduce myself. I’m Linda Wobbe, Head of Collection Management at Saint Mary’s College, in Moraga, a small town about 30 miles north of here. Saint Mary’s is a medium-sized liberal arts college with about 4,000 students. SMC has large master’s programs in Business & Education, and several smaller master’s programs. I volunteered to talk to you today about altmetrics, and the PlumX implementation at Saint Mary’s.
  • #3 We are all familiar with bibliometrics in general, or the use of publication data to measure research outcomes. Bibliometric studies have been used to measure researcher productivity and research value since the 1920s. A. J. Lotka's 1926 study of Chemical Abstracts found that 60% of the authors in that index published only one article. His mathematical model became known as Lotka's Law. ("Information systems," 2004) Better known today is Eugene Garfield's development of citation indexes, which became commercially available in 1963. Garfield posited that "The value of information is determined by those who use it." ("History of Citation Indexing," n.d.) While the use of citation counts to measure science research impact is well-studied, their validity and use by tenure committees are debated. (Vaughan & Shaw, 2003) Journal Impact Factors were created as a way for Garfield and his partner to select journals for the citation indexes. They are calculated by dividing the total number of citations to a journal over a two-year span by the total number of articles published in that journal during the same time span. Garfield disdains the use of journal impact factors as a proxy for faculty researcher quality, and has published widely arguing against that use. (Garfield, 2005) Nevertheless, rank & tenure committees often rely on journal impact factors in addition to citation counts as part of faculty tenure reviews.
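The two-year impact-factor calculation described in the note above is simple arithmetic; a minimal sketch (the journal numbers below are made up for illustration, not real data):

```python
def impact_factor(citations_this_year, citable_items_prev_two_years):
    """Two-year Journal Impact Factor: citations received in a given year
    to items the journal published in the previous two years, divided by
    the number of citable items published in those two years."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal: 150 citations in 2014 to its 2012-2013 articles,
# and 60 citable items published in 2012-2013.
print(impact_factor(150, 60))  # 2.5
```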
  • #4 Article citation counts and impact factors have many disadvantages. Citation counts lag far behind the publication date, by as much as 5–12 years (Nederhof, 2006), disadvantaging the young researchers who most need to prove their worth to tenure committees and granting agencies. Critics of ISI's citation counts have studied other types of bibliometrics, known as alternative metrics or enhanced metrics. It has been proposed that altmetrics can predict future citations. Several studies have found correlations between various forms of altmetrics and future citation counts. Zahedi, Costas, & Wouters (2014) found a correlation between Mendeley readership and citation impact indicators. Vaughan & Shaw (2003) found that for 57% of the journals they studied, web citations predicted traditional citation counts. Another research group found that tweets, Facebook wall posts and other social media mentions correlated with future citation counts (Thelwall, Haustein, Larivière, & Sugimoto, 2013). Traditional citation counts also privilege journal-based disciplines. Some researchers have proposed altmetrics as evaluative measures for book-centric disciplines; Google Book Search, for example, was proposed as an appropriate measure for literature and philosophy by Kousha & Thelwall (2009).
  • #5 Of course, alternative metrics also have many critics. Bornmann's 2014 article in the Journal of Informetrics is a detailed review of traditional bibliometrics, altmetrics, and the benefits and shortcomings of each. Altmetrics are not necessarily a sign of quality; they can simply indicate a frenzied media event, with one mention gathering duplicates and no knowledge formation in the process. Altmetrics are so disparate as to be meaningless to gather together, and they are subject to manipulation, such as robo-tweeting. Despite many studies, the empirical evidence that they measure anything is slim. Marcia McNutt, in a recent issue of Science magazine, our hosts for this conference, challenges the prevailing notion that bibliometrics or altmetrics are a useful measure to predict a young researcher's future performance.
  • #6 Nevertheless, at SMC, we decided on a trial run of a specific altmetrics tool, PlumX, for several reasons. First, to support our case for reaccreditation. The Associate Dean of Saint Mary's School of Business felt that traditional metrics failed to measure the research and social impact of our business faculty. She asked the business librarian to come up with a way to showcase the books, business cases, online lectures, and media interviews produced by business faculty for our regular AACSB accreditation reviews. When I realized what a giant data-gathering effort this librarian was embarking on, I was determined to find a commercial solution to help with the project. At the same time, the College gave up our Sedona faculty data collection system, due in part to the lack of participation by our Liberal Arts faculty. Comparing the available altmetric products, I thought PlumX was best able to showcase both traditional and non-traditional metrics for the variety of faculty at Saint Mary's. And with funds available due to the cancellation of Sedona, our Provost agreed to give PlumX a try.
  • #7 So, let's take a look at some of the screenshots. PlumX, an EBSCO product, gathers a wide variety of data from both traditional and alternative metrics. Traditional metrics, such as citations from Scopus, PubMed, Europe PubMed, CrossRef, and SSRN, are compiled in a separate grouping from more non-traditional metrics, such as social media shares and likes, and mentions (comments and Wikipedia links). Usage includes YouTube, EBSCO, and PLOS views, plus WorldCat holdings; captures include EBSCO saves, Goodreads readers, and Mendeley readers. Please note: the numbers are small. SMC's PlumX implementation is just barely getting started, and we have invited a small number of faculty to participate so far. It is a delicate maneuver to introduce this product, with no one wanting to require faculty to participate. We are scheduled to present this trial version to the faculty senate; it has already been shared with the faculty participants, the Provost (our funder), and the Faculty Development office.
  • #8 Here is the opening page for the Saint Mary’s PlumX site. Faculty from the School of Business were required to participate. And a sample of faculty from each of the other schools were invited to participate. All science faculty who were invited were delighted to participate. The business faculty were also happy to participate. However, many of the liberal arts faculty we thought would most benefit declined the invitation. The liberal arts faculty who did agree to showcase their research in our PlumX site are delighted with the results.
  • #9 This is the School of Business page. You can view metrics by any grouping of researchers you create. We decided on a School-based hierarchy, by department, then researcher. Here’s the School of Business faculty; they can be viewed by department, by researcher, by type of research output, even by the individual metrics. Each research output “artifact” has its own detailed page as well.
  • #10 This is the lower half of the School of Business site; you can view all research outputs. Citations are low, but usage and social media are high.
  • #11 A similar view for the handful of School of Science faculty; citation counts are almost at the same level as social media.
  • #12 You can view the metrics by any material type. This list of books published by SMC faculty is arranged by usage. Usage in PlumX includes YouTube views, EBSCO and PLOS downloads and views, and WorldCat holdings. Nothing here would be noticed by traditional metrics.
  • #13 Here’s the view for an individual artifact, in this case a book. This 2012 book, a compilation of fictional self-reflections by a popular young faculty member, has been the campus read at several universities, and is held by 400+ libraries. Yet traditional metrics wouldn’t have noticed it. The author is one of the few liberal arts faculty to agree to participate in our trial. The PlumPrint shows that usage and captures are larger than mentions for this book.
  • #14 Here’s the view for an individual article. In this case, views and downloads, and citations are the highest indicators.
  • #15 Here is a view for an early career researcher. This researcher, applying for tenure, was surprised and relieved by the usage numbers for her 2012 article. With no citations, traditional metrics would suggest that this article had no impact. Yet the article has been downloaded many times. She explained that she and her co-author hardly dared to publish the article, but felt obligated: it assesses the damage done to samples studied with the traditional methods invented by the most prominent researcher in her field. No one cites the article, she speculates, because everyone still uses the prevailing technique. She and her co-author fought an uphill battle to get the article published. You can see citations are the lowest indicator. I'm not showing the full page; at the top is the author's picture, with links to her profile and social media channels. And the widget for the author's profile, or for any individual artifact, can be embedded elsewhere; it is simple HTML code you can generate on the fly.
  • #16 Here is the profile of a tenured professor. While usage is still high, citations are also quite high, with no social media buzz. Actually, that is somewhat surprising, because he is very active in community projects, such as remediation plans for local closed military bases.
  • #17 Here is a profile for another young faculty member, one of the few Performing Arts faculty to agree to participate. YouTube views highlight the popularity of this young pianist. There are no traditional metrics here: no citations.
  • #18 I hope these few screenshots give you a feel for an altmetrics, or really enhanced metrics, view of one institution's research output. A few conclusions and questions: so far, both the scientists and non-science faculty who have participated have been enthusiastic about their PlumX profiles. We've been working closely with the PlumX team, because the Business Dean and the librarians all want more than what PlumX can deliver so far. The Dean wants Barnes & Noble book sales figures and Xanadu course pack information; librarians want ISI citation counts included. Several faculty are using these metrics in their tenure review files this year. But will tenure committees and the AACSB accrediting team value these metrics? And will we have time to complete profiles for all faculty if the Provost decides to continue our subscription?
  • #19 Thank you for listening! What do you think, would altmetrics offer something of value for your institution? These slides will be posted to my SlideShare account. Questions or comments?
  • #20 Thanks to all the researchers who provided insight.