“Open Research Data: Implications for Science and Society”, Warsaw, Poland, May 28–29, 2015, conference organized by the Open Science Platform — an initiative of the Interdisciplinary Centre for Mathematical and Computational Modelling at the University of Warsaw. pon.edu.pl @OpenSciPlatform #ORD2015
3. Who are we?
Altmetric is a data science company that tracks attention to research outputs, delivering article-level metrics via visually engaging, intuitive interfaces.
And we have been digging for data for a while.
4. What we look for
Research outputs, by identifier type:
• DOIs: general
• PubMed IDs: medicine and health
• arXiv IDs: physics, maths and computer science
• ADS IDs: Astrophysics Data System
• SSRN IDs: social sciences
• RePEc IDs: economics
• Handles: general
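As a purely illustrative sketch (not Altmetric's actual implementation), picking up mentions of outputs by the identifier types listed above might look like this; the patterns and function names are hypothetical and deliberately simplified:

```python
import re

# Hypothetical patterns for a few of the identifier types above;
# simplified for illustration, not production-grade validation.
ID_PATTERNS = {
    "doi": re.compile(r"\b10\.\d{4,9}/\S+"),
    "arxiv": re.compile(r"\barXiv:\d{4}\.\d{4,5}\b"),
    "pmid": re.compile(r"\bPMID:\s*\d+\b"),
}

def extract_identifiers(text):
    """Return every scholarly identifier found in a piece of text."""
    found = []
    for id_type, pattern in ID_PATTERNS.items():
        for match in pattern.finditer(text):
            found.append((id_type, match.group(0)))
    return found

tweet = "Great paper: arXiv:1501.00001 and also 10.1371/journal.pone.0115069"
print(extract_identifiers(tweet))
```

A real tracker would also resolve shortened links and normalize each identifier before counting a mention, but the core idea, text mining for known identifier shapes, is the same.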
5. Where are we looking for data?
NEWS OUTLETS
• Over 1,300 sites
• Manually curated list
• Text mining
• Global coverage
SOCIAL MEDIA AND BLOGS
• Twitter, Facebook, Google+, Sina Weibo (public posts only)
• 8,000+ blogs
REFERENCE MANAGERS
• Mendeley etc.
• Demographics
OTHERS…
• YouTube
• Reddit
• F1000
• Q&A sites
• Policy documents…
POST-PUBLICATION PEER REVIEW SITES
• Publons
• PubPeer
6. Policy documents as an essential source
• ASHA Practice Policy
• AWMF (Association of Scientific Medical Societies)
• European Food Safety Authority (EFSA)
• Food and Agriculture Organization (FAO)
• GOV.UK: policy papers, research and analysis
• Intergovernmental Panel on Climate Change (IPCC)
• International Committee of the Red Cross (ICRC)
• International Monetary Fund (IMF) (tracking working papers)
• Médecins Sans Frontières (MSF)
• Mental Health Foundation (UK)
• NICE Evidence
• UNESCO
• World Health Organization (WHO)
• Australian Policy Online
• The World Bank
• International Fund for Agricultural Development
• Scottish Intercollegiate Guidelines Network
And more being added each week…
7. The conversation is moving online…
• 44K online mentions of scholarly articles every day: 1 mention every 2 seconds!
• 50K unique articles are shared each week.
• >3.5M articles with tracked attention data.
Source: Altmetric internal data, March 2015
9. The Donut: impact at a glance
Colours change dynamically with attention:
• High traction across all sources
• Strong news coverage
• Strong Facebook and G+/Twitter traction
15. How is the public engaging with your research?
Do you even know how to find out?
16. Is anyone getting it wrong?
Are your marketing and communications offices aware?
Copyright Randall Munroe, XKCD, http://xkcd.com/386/
17. What we do with this data
• Aggregates output-level metrics into dashboards for:
– Individual researchers
– Departments, units or research groups within an institution
– Subject areas
– The entire institution
– Customized groups that include data external to the institution
18. What we do with this data
• Streams mentions in real time
• Sorts outputs by numerous filters: journal, publisher, funder, unique identifier, PubMed query and more
• Sets up alerts
• Emails and exports reports
• Facilitates benchmarking and comparative analysis of groups
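The filtering and sorting described above can be sketched in a few lines; the record shape and field names below are illustrative assumptions, not the product's actual data model:

```python
# Hypothetical output records; field names are illustrative only.
outputs = [
    {"doi": "10.1000/a", "journal": "PLOS ONE", "funder": "NSF", "mentions": 120},
    {"doi": "10.1000/b", "journal": "Nature", "funder": "NIH", "mentions": 340},
    {"doi": "10.1000/c", "journal": "PLOS ONE", "funder": "NIH", "mentions": 15},
]

def filter_outputs(records, **criteria):
    """Keep only records whose fields match every given criterion."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

# Example: outputs funded by the NIH, most-mentioned first.
nih = sorted(filter_outputs(outputs, funder="NIH"),
             key=lambda r: r["mentions"], reverse=True)
print([r["doi"] for r in nih])
```

The same pattern extends to any of the filters named above (journal, publisher, funder, identifier) by passing a different keyword argument.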
19. Remember those questions?
Here’s how Altmetric for Institutions can help you answer them…
20. What is happening to your research right now online?
The summary report tells you at a glance, breaking mentions out by source…
Track attention over time
24. How is the public engaging with your research?
25. Does the PR or Comms department need to manage any online discussions?
Tweet, retweet, reply, right in the interface!
Copyright Randall Munroe, XKCD, http://xkcd.com/386/
26. Funders want to see a “broader impact…”
“The primary aim of SEP assessments is to reveal and confirm the
quality and the relevance of the research to society and to improve
these where necessary.”
https://www.knaw.nl/nl/actueel/publicaties/standard-evaluation-protocol-2015-2021
27. Funding pools drying up: show me the (broader) impact!
Grant funders are looking for proof of “broader impacts”, often defined as “an effect, change, or benefit to the economy, society, culture, public policies, health, the environment, etc.”
Research Excellence Framework, http://www.ref.ac.uk/panels/assessmentcriteriaandleveldefinitions/
“Broaden dissemination to enhance scientific and technological understanding, for example, by presenting results of research and education projects in formats useful to students, scientists and engineers, members of Congress, teachers, and the general public.”
http://www.nsf.gov/pubs/2007/nsf07046/nsf07046.jsp
Editor's Notes
Welcome attendees.
We’ll be taking a look at how Altmetric can help researchers, departments and institutions evaluate their research impact in a timely and comprehensive manner.
Running time of 30 minutes, plus questions at the end. Use the raise-hand feature for asking questions.
Please feel free to ask questions via the Q&A feature. Please also use this for any audio issues you may have.
Let’s get started.
What are “altmetrics”?
“alternative metrics”
new ways of measuring different, non-traditional forms of impact.
“alternative to only using citations”, not “alternative to citations”.
complementary to traditional citation-based analysis.
“Article-level metrics” has come to refer to any metrics (including altmetrics) that surround a scholarly article.
Altmetric is a data science company.
We curate an extensive list of sources, making sure each source is relevant and valuable for judging scholarly impact across emerging media.
We monitor more than 1300 news sites, and employ text mining and link tracking to pick up mentions of scholarly articles discussed in the press.
The list is global, with multilingual sources, and we add to it all the time.
Read and comment on the rest of them – on Mendeley – mention latest addition of demographics.
The most recent addition to our list of sources is tracking policy docs. This way, the impact of research on public policy can be measured easily and clearly at an article level, department level or an institutional level overall.
Some policy sources include:
WHO, the IMF, GOV.UK, the Food and Agriculture Organization, the UK’s Mental Health Foundation, the IPCC and many more. We add to this list on a continual basis.
Aiming to track over 40 policy sites by end of the year.
In the era of print journals, citation counts and impact factors were developed as the main source of bibliometric data.
They were never useful for applied research aimed at practitioners and policymakers, or for non-traditional research outputs that don’t get published in journals.
Interest in academic outputs can now be expressed through online news sources, blog posts, Twitter and Facebook shares, bookmarking or reading in tools like Mendeley, and critiques on online peer review and recommendation sites.
These alternative, article-level metrics:
complement traditional citation-based metrics
give early intelligence about the level of interest in a research output
find engagement beyond academia
apply to all subject disciplines
As this gap has become wider, funders are noticing that judging the impact of the research they fund requires new tools and the monitoring of new platforms.
They need more evidence about the attention their work is receiving:
Is funding going towards engaging work?
What other areas are getting traction within alternative metrics?
What are particular other funders getting traction with?
Can new avenues for funding be approached, or engaged with?