Two aspects to note about traditional impact factors: they accrue to a journal rather than to a specific article, and they take time to develop. An impact factor is derived only after an article has been published and after sufficient time has elapsed for articles that cite the original article to have been published.
The Institute for Scientific Information (ISI) was the original developer of the impact factor. It began as a mechanism for determining which journals should be included in the Science Citation Index (now part of the Web of Science database, along with the Social Sciences Citation Index and the Arts & Humanities Citation Index). ISI was acquired by the Thomson Corporation (now Thomson Reuters) in the early 1990s, but many people still refer to the ISI impact factor.
“Eigenfactor.org® is an academic research project co-founded by Jevin West and Carl Bergstrom and sponsored by the Bergstrom Lab in the Department of Biology at the University of Washington. We aim to use recent advances in network analysis and information theory to develop novel methods for evaluating the influence of scholarly periodicals and for mapping the structure of academic research.” (http://www.eigenfactor.org/about.php)
Description of the SCImago Journal Rank indicator: http://www.scimagojr.com/SCImagoJournalRank.pdf
Bateman, A. (2012, October 19). Why I love the H-index - PLOS Biologue. Retrieved from http://blogs.plos.org/biologue/2012/10/19/why-i-love-the-h-index/
Based on citations harvested in the Google Scholar database.
Why are altmetrics emerging? Because they can: various web technologies give us new methods for discovering scholarship and, additionally, the ability to capture response to that scholarship.
Traditional functions of the scholarly journal:
• Archiving: permanently storing scholarship for later access.
• Registration: time-stamping authors' contributions to establish precedence.
• Dissemination: getting scholarly products out to scholars who want to read them.
• Certification: assessing contributions and giving “stamps of approval.”
Altmetrics can track impact at a more granular level (e.g., the article) and can give a measure of the immediacy and socialization of an article.
Different types of metrics can be captured/measured/reported.
Among other practices (e.g. publish in open access journals, practice self-archiving by depositing articles in PDXScholar), researchers should register for an ORCID ID. This identifier makes it easier for search engines to disambiguate author names and to consolidate metrics for a researcher.
New research networking sites are emerging and gaining recognition as tools for disseminating research, connecting with colleagues, finding collaborators, and finding experts. However, while these sites allow you to upload articles, PDXScholar is likely to be a better option for permanent, web-based archiving of your scholarly work. The PSU Library staff will ensure that the articles archived in PDXScholar carry the appropriate copyright clearances (no nasty take-down notices to worry about), and PDXScholar can supply usage data for your publications.
And ask questions of your subject librarian (http://library.pdx.edu/subjectlibrarians.html) or Scholarly Communications Coordinator, Sarah Beasley (email@example.com). If you have questions about depositing your publications in PDXScholar, ask Karen Bjork, Digital Initiatives Librarian (firstname.lastname@example.org).
Beyond the Journal Impact Factor: Altmetrics; New Ways of Measuring Impact
Scholarly Communication Coordinator
February 26, 2014
Traditional Impact Factor
• Metrics that impute reputation and impact for the journal
based on how frequently articles published in that journal
are cited in other published articles.
ISI or Web of Science impact factor
• Published annually in the Journal Citation Reports (which
lags by a year, i.e. the most recent JCR data is for 2012)
• # of citations during the year to articles published in ABC
Journal in the previous two years ÷ # of articles published
in ABC Journal in those two years
• A value of 1.0 means that, on average, the articles
published in ABC Journal within the past two years have
each been cited once
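The ratio above can be sketched in a few lines of Python (all counts are hypothetical, for illustration only):

```python
# Sketch: computing a two-year journal impact factor.
# The counts below are made up for illustration.

def impact_factor(citations_this_year, articles_prev_two_years):
    """Citations in year Y to articles from years Y-1 and Y-2,
    divided by the number of articles published in Y-1 and Y-2."""
    return citations_this_year / articles_prev_two_years

# Hypothetical ABC Journal: 150 citations in 2013 to its
# 2011-2012 articles, of which there were 120.
print(impact_factor(150, 120))  # 1.25
```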
• If a researcher were to go to the library, pick up a
random journal article, and then randomly follow a cited
reference in that article, how often would they end up at
journal X? That, roughly, is journal X's Eigenfactor.
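The random-reader thought experiment corresponds to finding the stationary distribution of a citation network. A minimal sketch, using a made-up three-journal citation matrix and simple power iteration (not the actual Eigenfactor algorithm, which also corrects for journal self-citation and article counts):

```python
# Sketch of the random-walk idea behind the Eigenfactor (hypothetical data).
# A reader repeatedly follows a random cited reference; a journal's score is
# the fraction of time the walk spends at that journal, i.e. the stationary
# distribution of the citation matrix, found here by power iteration.

journals = ["A", "B", "C"]
# cites[i][j] = probability that a reference in journal i points to journal j
cites = [
    [0.0, 0.5, 0.5],
    [0.8, 0.0, 0.2],
    [0.5, 0.5, 0.0],
]

p = [1 / 3] * 3                      # start at any journal with equal probability
for _ in range(100):                 # iterate until the walk settles
    p = [sum(p[i] * cites[i][j] for i in range(3)) for j in range(3)]

for name, score in zip(journals, p):
    print(f"{name}: {score:.3f}")    # A scores highest: it is cited most often
```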
• Developed by the SCImago research group from analytics
harvested from Scopus, the database of the major science,
technology and medicine (STM) publisher Elsevier
• SCImago Journal Rank (SJR) is based on the Google
PageRank algorithm
• Measures not just the number of citations to an article but
the prestige of the journal in which the citing article
appears
H-index: devised for authors, but can be applied
to journals as well
• “The H-index measures
the maximum number of
papers N you have, all
of which have at least N
citations. So if you have
3 papers with at least 3
citations, but you don’t
have 4 papers with at
least 4 citations then
your H-index is 3.”
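The definition quoted above is easy to compute. A minimal sketch, using made-up citation counts:

```python
# Sketch: computing an h-index from a list of per-paper citation
# counts (numbers below are made up for illustration).

def h_index(citations):
    """Largest N such that at least N papers have >= N citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cited in enumerate(counts, start=1):
        if cited >= rank:
            h = rank      # still have `rank` papers with >= `rank` citations
        else:
            break
    return h

# Three papers with at least 3 citations, but not four with at
# least 4 citations, so the h-index is 3 (as in the quote above):
print(h_index([10, 5, 3, 2, 1]))  # 3
```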
• Metrics at the level of the individual article
• Facilitated by various web technologies
• Can include scholarly commentary: article comments;
download stats from publishers, full-text vendors, or
institutional repositories (such as PDXScholar)
• PLOS tracks page views, downloads, PubMed Central
usage, Scopus, Google Scholar, and CrossRef citations
Or social media
• News coverage
• Blog posts
• Facebook likes
Usage         Views on the publisher’s site; downloads
Captures      Bookmarks on CiteULike; shares in Mendeley
Mentions      Blog mentions; comments on the publisher’s site
Social Media  Facebook likes; shares on LinkedIn; tweets
Citations     Citations in Web of Science
The Altmetrics tools
• Altmetric – actually also a product name. Has been
adopted by, among others, several prominent STM
publishers and platforms: Springer, Nature Publishing
Group, Scopus, BioMed Central
• Impact Story – Open source altmetric tool. Draws on
variety of social and scholarly data sources.
• Plum Analytics – recently acquired by
EBSCO; focuses on metrics for
articles, chapters, datasets, presentations, and source code
• PLOS – has developed its own (freely available) code for
article-level metrics
What’s a researcher to do?
• ORCID ID – orcid.org
Set up profiles on popular academic sites
• Academia.edu
• Research Gate
• BUT: archive in PDXScholar!
• Building an online presence can be exhausting
• How to Build an enduring online research presence using
social networking and open science
• Article Level Metrics: a SPARC Primer
Greg Tananbaum April 2013
• Building an online presence can be exhausting!
• “Sure, there are about 50 ways to disseminate your work online, but
most of them promise to make it a lot easier than they do. So, focus is
important. At this point, I let Google Scholar catch my publications
and citations (be patient, sometimes it takes a few days!), LinkedIn for
my “generic” professional presence (does anybody really take
LinkedIn endorsements seriously?), and Twitter for sharing mostly
worky things that I think are interesting. Note that I still have some
work to do on keeping personal and professional things separate on
Twitter.”
Declaration on Research Assessment
The San Francisco Declaration on Research
Assessment (DORA), initiated by the American Society for Cell
Biology (ASCB) together with a group of editors and publishers
of scholarly journals, recognizes the need to improve the ways
in which the outputs of scientific research are evaluated. The
group met in December 2012 during the ASCB Annual Meeting
in San Francisco and subsequently circulated a draft
declaration among various stakeholders. DORA as it now
stands has benefited from input by many of the original signers
listed below. It is a worldwide initiative covering all scholarly
disciplines. We encourage individuals and organizations who
are concerned about the appropriate assessment of scientific
research to sign DORA.