Altmetrics, Impact Analysis
and Scholarly Communication
Presentation to the University of Cape Town Emerging Researchers Programme
Michelle Willmers
Project Manager: OpenUCT Initiative
5 June 2014
CC-BY-NC
Premise
Impact is more than citation and
scholarly communication comprises
more than journal articles and book
chapters
Impact is defined as the social, financial and environmental effects of
research. Planning and capturing impact however is a difficult and
resource-intensive activity, demanding both strategic commitment and
infrastructure support. A means to systematically capture and monitor
impact across the organisation is crucial to continued research success. In
addition, with impact data capture as an emerging practice, there is the
opportunity and necessity for a degree of standardisation in the
approach to measuring impact across HEIs.
Fedorciow & Bayley 2014
Why a new focus on engaging with
Impact?
1. The world is changing and content usage paradigms have changed too. New
forms of impact assessment are required.
2. Networking and collaboration are key in 21st century scholarship. New
indicators for assessing impact reflect this.
3. Demands for new forms of business intelligence in HE management.
4. Academics want to see their outputs utilised (and need their egos stroked).
5. Funders are increasingly demanding evidence of downstream usage and uptake.
6. African HE has a strong role to play in the development context > the imperative for
our work to “make a difference” is stronger than ever.
7. Everybody else is doing it (we need to stay abreast of the global conversation).
8. Practice is giving rise to new disciplines and approaches to scholarship
(Altmetrics, Research Uptake, etc.)
Challenges and defining
characteristics of Impact analysis
“Our results indicate that the notion of scientific impact is a multi-dimensional construct that
cannot be adequately measured by any single indicator, although some measures are more
suitable than others.” (Bollen et al. 2009)
1. Impact is relative
(You make decisions about what is important to you and to your discipline/network > no
longer a neat “just one number” solution)
[Diagram: “Impact” at the centre, surrounded by Values, Mission, Rewards & Incentives, Relevance and Prestige]
Challenges and characteristics
(cont.)
2. It takes time to demonstrate … in some disciplines more than others
3. Attributing impact (establishing correlation) can be really hard
4. It requires data (systems to generate it and time/methods to analyse it)
5. It is bound up in the socio-political (and economic) paradigm of Northern-driven
20th century publishing hierarchies
6. One of the hardest things in a digital space is to figure out what to measure and
what to ignore.
7. Disciplinary bias favours life sciences in current practice
8. Balancing individual focus with coherence across HEIs and international
environment
9. “The advantage of simple citation counts, and Impact Factors for that matter, is that
they are discrete – they are scores that can be compared against one another in a
univariate fashion. But the network is not univariate; online measurement is
necessarily multivariate, and figuring out the best predictors and how they should be
measured is a far more complicated probabilistic design and statistical methodology.”
(Brantley 2013)
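The multivariate point can be made concrete with a small sketch in the spirit of Bollen et al.'s principal component analysis. All data below is synthetic and purely illustrative: several indicators share one underlying “impact” signal, yet the variance still spreads across components, so no single score tells the whole story.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 100 papers scored on 5 hypothetical impact indicators
# (e.g. citations, downloads, tweets, bookmarks, blog mentions).
n_papers, n_indicators = 100, 5
latent = rng.normal(size=(n_papers, 1))            # one shared "impact" signal
loadings = rng.uniform(0.5, 1.5, size=(1, n_indicators))
noise = rng.normal(scale=0.5, size=(n_papers, n_indicators))
scores = latent @ loadings + noise

# Principal component analysis via SVD on the centred score matrix
centred = scores - scores.mean(axis=0)
_, s, _ = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)   # fraction of variance per component
print(explained.round(2))         # first component dominates, but not completely
```

Even with one dominant latent factor, the remaining components carry real variance; with genuinely heterogeneous indicators the picture is messier still, which is the Brantley point above.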
We are communicating a wider range of outputs (in many electronic formats) to a broader
audience. We are also exposed to more digital information and witnessing changing
research methods. This new system requires an expanded view of, and a more sophisticated
approach towards, Impact.
Blogs · Articles · Books · Tweets · Working Papers · Policy Briefs · Conference Papers
In order to track the impact of these artifacts they must be curated and preserved at
institutional/national/regional level (NOT solely by publishers). This requires
e-infrastructure, curatorial capacity and new roles and responsibilities.
Repositories · Archives · Websites
Citation · Altmetrics · Analytics
Curation and preservation enables us to track, preserve and audit our scholarly
communication practice in new ways
This process of tracking content is being explored in a number of
transforming and emerging disciplines, each with their own affiliated
communities and disciplinary leanings. It is up to academics and
institutions to make sense of the terrain in their own context.
(1) Open Access is the fuel that drives
the scholarly communication machine
“The case for Open Access within a university is not simply political or
economic or professional. It needs to rest in the notion of what a
university is and what it should be … It is central to the university’s
position in the public space”
Professor Martin Hall, Vice Chancellor, University of Salford, UK
Within this new paradigm…
(2) Open licensing mechanisms provide
security, author protection and machine
interoperability
(3) Time is the central commodity
According to Brody and Harnad (2005), it takes five
years for a paper in physics to receive half of the
cited-by references that the article will ever acquire.
If you want to keep pace with your researchers, you
cannot make collection decisions based on five-year
old information. […] The biggest problem in using JIF
and others is that in today’s research landscape they
are lagging indicators.
Michalek et al. 2014
(4) The network is a key mechanism for facilitating
research, obtaining funding, disseminating outputs
and gauging application/rigour/impact
(5) New systems of impact analysis and
Altmetrics rely on proxies or indicators
that become meaningful based on context
and strategic vision
Within this new system, data on downstream access to and application of
research is understood through a combined quantitative and qualitative
approach, forming an Impact narrative
> This narrative is shaped by our values and mission (and the networks
we are part of and value)
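As an illustration of how proxies might be weighted into a values-driven composite, the sketch below computes a hypothetical score in which the weights and discipline baselines encode an institution's mission. Every number and name here is invented for illustration, not a prescribed method:

```python
# Illustrative only: a composite "impact" score where the weights encode
# institutional values/mission (all figures hypothetical).
indicators = {"citations": 12, "downloads": 840, "policy_mentions": 3}

# Normalise each indicator against an assumed discipline baseline,
# then weight by how much the institution's mission values it.
baselines = {"citations": 20, "downloads": 500, "policy_mentions": 1}
weights = {"citations": 0.3, "downloads": 0.2, "policy_mentions": 0.5}

score = sum(weights[k] * indicators[k] / baselines[k] for k in indicators)
print(round(score, 2))  # → 2.02
```

The point of the sketch is the shape, not the numbers: a unit with a public-policy mission would weight `policy_mentions` heavily, while another might privilege citations, yielding different narratives from identical data.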
(6) Global interrogation of the JIF is
gaining momentum…
(Matched locally by increasing interrogation of the
SAPSE reward system)
So…
How do we begin to engage?
1. Establish and support the inter-relationship
between research management, impact and curation
The impact agenda does not only serve external assessment or funding activities, but also the
core business of the organisation. Most fundamentally, data on the effects of research support
individual and institutional performance monitoring and give valuable feedback on public
engagement with research. More strategically, such information underpins discourse and
feedback across the academic/non-academic divide, creating a more generative
context for impact creation. Where both the creators (academics) and the recipients
(stakeholders) can collaborate effectively these benefits can be maximised, breeding further
strategic success and creating a ‘virtuous circle’ of good impact.
Fedorciow & Bayley 2014
2. Curate to participate
• Recognise and support growing cohort of scholarly
communication officers, research uptake managers, IT
support, etc.
• Commit ongoing resources for long-term sustainability
3. Initiate and promote institutional dialogue
• Increasing importance of libraries
• Articulate networks and areas of desired impact
• Transforming reward and incentive systems
• Confidence to serve the institutional mission
4. Explore
Current challenges in applying
Altmetrics to Southern African research
(Neylon et al. 2014)
1. Participating institutions do not maintain good records of their own institutional
outputs
2. Participating researchers do not retain high quality information on the location
and identification of their own research outputs
3. Bibliographic metadata available from institutions and researchers is
poor and creates a large workload in preparing content datasets. Systems that
collect identifiers and online locations for research outputs will significantly
reduce the workload involved in obtaining usage and impact data.
4. For those outputs with unique identifiers (such as DOIs, PMIDs) it is
straightforward to obtain data on their use. Available data is therefore often
skewed towards STEM outputs.
5. There is limited evidence of social media activity around outputs from SCAP
institutions. There may be opportunities for organisations to use social media to
increase engagement.
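On point 4 above: where an output carries a DOI, public services such as Crossref's REST API expose usage-relevant metadata. A minimal sketch, with helper names of our own invention and a mocked sample record rather than a live fetch:

```python
import json
import urllib.request

CROSSREF_API = "https://api.crossref.org/works/"

def crossref_url(doi):
    """Build the Crossref works URL for a given DOI."""
    return CROSSREF_API + doi

def summarise_work(record):
    """Pull a few usage-relevant fields from a Crossref 'message' record."""
    return {
        "doi": record.get("DOI"),
        "title": (record.get("title") or ["(untitled)"])[0],
        "cited_by": record.get("is-referenced-by-count", 0),
    }

# A live call would be:
#   record = json.load(urllib.request.urlopen(crossref_url(doi)))["message"]
# Here a mocked record keeps the example offline (the count is invented):
sample = {
    "DOI": "10.1371/journal.pone.0006022",
    "title": ["A Principal Component Analysis of 39 Scientific Impact Measures"],
    "is-referenced-by-count": 500,
}
summary = summarise_work(sample)
print(summary["cited_by"])  # → 500
```

Outputs without such identifiers (common for working papers and policy briefs in the SCAP context) have no equivalent lookup, which is exactly why the available data skews towards STEM.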
References
Bollen J, Van De Sompel H, Hagberg A & Chute R (2009) A principal component analysis of 39 scientific
impact measures. PLoS ONE 4(6): e6022. DOI: 10.1371/journal.pone.0006022. Available at
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0006022
Fedorciow L & Bayley J (2014) Strategies for the management and adoption of impact capture
processes within research information management systems. Procedia Computer Science 00 (2014)
http://dspacecris.eurocris.org/jspui/bitstream/123456789/182/1/
Fedorciow_Bayley_CRIS2014_Rome.pdf
Hall M (2014) The case for Open Access within a university. Bournemouth University Research Blog.
http://blogs.bournemouth.ac.uk/research/2014/05/28/the-case-for-open-access-within-a-university/
Michalek A, Buschman M & McEvoy K (2014) Analyze this: Altmetrics and your collection – Statistics
and collection development. Against the Grain, April 2014.
http://www.plumanalytics.com/downloads/v26-2_AnalyzeThis.pdf
Neylon C, Willmers M and King T (2014) Rethinking Impact: Applying Altmetrics to Southern African
Research. Working Paper 1, Scholarly Communication in Africa Programme.
http://openuct.uct.ac.za/sites/default/files/media/SCAP_Paper_1_Neylon_et_al_Rethinking_Impact.pdf


Editor's Notes

  • #10 http://blogs.bournemouth.ac.uk/research/2014/05/28/the-case-for-open-access-within-a-university/
  • #12 Michalek et al. (2014) http://www.plumanalytics.com/downloads/v26-2_AnalyzeThis.pdf
  • #16 http://ndabaonline.ukzn.ac.za/UkzndabaStory/NdabaOnline-Vol2-Issue31/The%20UKZN%20Griot%20Of%20Bean%20Counters%20and%20Evolution
  • #17 http://theconversation.com/do-not-resuscitate-the-journal-impact-factor-declared-dead-14480?utm_source=buffer&utm_medium=twitter&utm_campaign=Buffer&utm_content=bufferc2604