Measure for Measure: The role of metrics in assessing research performance - Society for Scholarly Publishing. June 2013
1. Measure for Measure
   The role of metrics in assessing research performance
   Society for Scholarly Publishing - June 2013
   Michael Habib, MSLS, Product Manager, Scopus
   habib@elsevier.com | Twitter: @habib | http://orcid.org/0000-0002-8860-7565
2-4. Which metric should I use?
   1. What level am I assessing?
      - Article, Journal, Researcher, Institution, etc.
   2. What type of impact am I assessing?
      - Research, Clinical, Social, Educational, etc.
   3. What methods are available based on the above?
      - Metrics: Citation, Usage, Media, h-index, SNIP, SJR, etc.
      - Qualitative: Peer review, etc.
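The three-question checklist above can be sketched as a simple lookup. This is an illustrative sketch only: the grouping of metrics under each assessment level is my own assumption based on examples scattered through the deck, not an official mapping.

```python
# Hypothetical pairing of assessment levels with example metrics from the
# deck; the groupings are assumptions for illustration.
METRICS_BY_LEVEL = {
    "article": ["citation count", "usage/downloads", "media mentions"],
    "journal": ["SNIP", "SJR", "Journal Usage Factor"],
    "researcher": ["h-index"],
    "institution": ["Snowball Metrics"],
}

def candidate_metrics(level: str) -> list:
    """Return example metrics for an assessment level (step 1 of the checklist)."""
    return METRICS_BY_LEVEL.get(level.lower(), [])
```

Steps 2 and 3 (impact type, available methods) would narrow this candidate list further in the same way.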
5-6. Article-Level Metrics vs. Altmetrics
   Image from: http://www.plosone.org/article/metrics/info%3Adoi%2F10.1371%2Fjournal.pone.0006022
   APIs are available to easily and freely embed Scopus Cited-by counts on your article pages.
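The deck notes that APIs exist for embedding Scopus Cited-by counts on article pages. As a hypothetical sketch of how a page widget might use such an API: the endpoint URL, the `apiKey` parameter, and the `citedByCount` response field below are all assumptions for illustration, not the documented Scopus API.

```python
import json
from urllib.parse import urlencode

# Placeholder endpoint, not the real Scopus API.
BASE = "https://api.example.com/cited-by"

def cited_by_url(doi: str, api_key: str) -> str:
    """Build the query URL a page widget might call for one article's DOI."""
    return BASE + "?" + urlencode({"doi": doi, "apiKey": api_key})

def parse_count(payload: str) -> int:
    """Extract the citation count from an assumed JSON response shape."""
    return int(json.loads(payload)["citedByCount"])
```

A widget would call `cited_by_url(...)`, fetch the response, and render `parse_count(...)` next to the article title.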
7. The San Francisco Declaration on Research Assessment (DORA): http://am.ascb.org/dora/
8. Background & approach
   Adrian Mulligan, Gemma Deakin and Rebekah Dutton, Elsevier Research & Academic Relations
   Who & when: 54,442 individuals were randomly selected from Scopus and approached to complete the study in October 2012. To ensure an unbiased response, Elsevier's name was revealed only at the end of the survey.
   Responses: The online survey took around 15-20 minutes to complete. 3,090 respondents completed it, representing a response rate of 5.7%. Data have not been weighted; there was a representative response by country and discipline.
   Statistical testing: Error margin ± 1.5%, at 90% confidence levels. When comparing the score for the main group and sub-groups, a Z test of proportion was used to identify differences between the overall average and the sub-group (90% confidence levels), when there are 30 or more responses.
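The Z test of proportion mentioned in the methodology can be sketched as follows. This is a minimal sketch assuming the standard pooled two-sample form; the survey team's exact variant is not specified in the deck.

```python
from math import sqrt

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """Z statistic for the difference between two proportions,
    using a pooled estimate for the standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# At 90% confidence (two-sided), |z| > 1.645 flags a significant difference
# between a sub-group's percentage and the overall average.
```

Note that in the survey the sub-groups are subsets of the total sample, so the two samples are not independent; the slide tables simply flag cells where this test fired.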
9. Most widely known by researchers
   Impact Factor (n=2,520)       82%
   h-index (n=1,335)             43%
   Journal Usage Factor (n=309)  10%
   Altmetrics (n=41)              1%
   Impact Factor is published by Thomson Reuters. Altmetrics were the least well known.
10. Awareness and researcher perception of most useful metrics
    Q2: Which of these do you think are most useful at measuring research quality? (Select up to 3)
    Metric                         Aware (of n=3,090)   Rated most useful (of those aware)
    Impact Factor (n=2,530)        82%                  64%
    SNIP (n=51)                     2%                  29%
    SJR (n=126)                     4%                  29%
    Eigenfactor (n=285)             9%                  28%
    h-index (n=1,335)              43%                  58%
    Journal Usage Factor (n=309)   10%                  37%
    F1000 (n=155)                   5%                  34%
    Alt-metrics (n=41)              1%                  42%
    Only people who said they were aware of a particular metric in Q1 were given the opportunity to select that metric in Q2. See appendix for background and approach. Research by Elsevier Research & Academic Relations. Impact Factor is published by Thomson Reuters.
11. Generally, metrics with the highest awareness are also considered to be the most useful
    [Scatter plot: x-axis = percentage of respondents aware of each metric; y-axis = percentage of aware respondents who chose the metric as one of the most useful; points for Impact Factor, SNIP, SJR, Eigenfactor, h-index, Journal Usage Factor, F1000, and Alt-metrics; R² = 0.697]
    The trend line shows the linear relationship between awareness and usefulness ratings. Metrics above the line have lower levels of awareness but are more likely to be rated as useful than the typical awareness-usefulness relationship; metrics below the line have higher levels of awareness but are less likely to be rated as useful.
    See appendix for background and approach. Research by Elsevier Research & Academic Relations. Impact Factor is published by Thomson Reuters.
12. Assessing the usefulness of potential quality metrics: by age
    Q3: Thinking about possible new measures of research productivity, how useful do you think the below would be in assessing the quality of a researcher or a research article? (% think it would be extremely/very useful)
    Potential metric                                   <36      36-45    46-55    56-65    >65      TOTAL
                                                       (n=540)  (n=920)  (n=819)  (n=507)  (n=242)  (n=3,090)
    Article views/downloads (articles)                 43%      44%      45%      44%      36%      43%
    Citations from materials in repositories           49%      45%      41%      41%      37%      43%
    Share in social network mentions (articles)        21%      18%      15%      12%      13%      16%
    Number of readers (articles)                       42%      41%      39%      41%      35%      40%
    Number of followers (researchers)                  38%      33%      28%      30%      30%      31%
    Contribution to peer review (researchers)          34%      29%      27%      26%      24%      28%
    Votes or ratings (articles)                        35%      24%      22%      22%      19%      24%
    Score based on reviewer assessment (articles)      33%      29%      27%      27%      27%      28%
    [The original slide flagged cells with significant differences between subset and total (subset higher or lower); those markers are not recoverable here.]
13. Assessing the usefulness of potential quality metrics: by region (1 of 2)
    Q3: Thinking about possible new measures of research productivity, how useful do you think the below would be in assessing the quality of a researcher or a research article? (% think it would be extremely/very useful)
    Potential metric                                   Africa   APAC     E. Europe  Lat. America  TOTAL
                                                       (n=72)   (n=803)  (n=183)    (n=182)       (n=3,090)
    Article views/downloads (articles)                 56%      50%      50%        50%           43%
    Citations from materials in repositories           51%      55%      49%        49%           43%
    Share in social network mentions (articles)        26%      27%      19%        21%           16%
    Number of readers (articles)                       49%      46%      45%        45%           40%
    Number of followers (researchers)                  36%      46%      41%        34%           31%
    Votes or ratings (articles)                        33%      29%      30%        24%           24%
    Contribution to peer review (researchers)          40%      35%      28%        32%           28%
    Score based on reviewer assessment (articles)      44%      36%      26%        35%           28%
    [The original slide flagged cells with significant differences between subset and total (subset higher or lower); those markers are not recoverable here.]
14. Assessing the usefulness of potential quality metrics: by region (2 of 2)
    Q3: Thinking about possible new measures of research productivity, how useful do you think the below would be in assessing the quality of a researcher or a research article? (% think it would be extremely/very useful)
    Potential metric                                   Mid. East  N. America  W. Europe  TOTAL
                                                       (n=47)     (n=770)     (n=1,033)  (n=3,090)
    Article views/downloads (articles)                 40%        41%         36%        43%
    Citations from materials in repositories           40%        42%         32%        43%
    Share in social network mentions (articles)        19%        10%         11%        16%
    Number of readers (articles)                       43%        36%         36%        40%
    Number of followers (researchers)                  32%        23%         23%        31%
    Votes or ratings (articles)                        28%        19%         22%        24%
    Contribution to peer review (researchers)          32%        26%         23%        28%
    Score based on reviewer assessment (articles)      34%        26%         22%        28%
    [The original slide flagged cells with significant differences between subset and total (subset higher or lower); those markers are not recoverable here.]
15. "For publishers: Greatly reduce emphasis on the journal impact factor as a promotional tool, ideally by ceasing to promote the impact factor or by presenting the metric in the context of a variety of journal-based metrics ... that provide a richer view of journal performance."
    - from The San Francisco Declaration on Research Assessment (DORA), http://am.ascb.org/dora/
16. Advantages of SNIP & SJR
    - Transparency: calculated by independent third parties; freely and publicly accessible at www.journalmetrics.com
    - Subject field normalization: allows comparison independent of the journals' subject classification; reflects the most current journal scopes, thereby taking ongoing changes into account
    - 3-year citation window: demonstrably the fairest compromise
    - Manipulation-resistant: article-type consistency; only citations to and from articles, reviews, and conference papers are considered
    - Breadth of coverage: Scopus has over 20,500 sources: 19,500 journals as well as trade publications, proceedings, and book series
    - Metrics based on Scopus.com: underlying database available for transparency; titles indexed based on transparent criteria by an independent advisory board
    Cons:
    - More complex methodology
    - Do not take the amount of review content into account
    - Low awareness
17. Modified SNIP
    - Refined metric calculation that better corrects for field differences
    - Outlier scores are closer to the average
    - Readily understandable scoring scale with an average of 1 for easy comparison
    Modified SJR
    - Reflects the more prestigious nature of citations that come from within the same, or a closely related, field
    - Overcomes the tendency for prestige scores to decrease as the quantity of journals increases
    - Readily understandable scoring scale with an average of 1 for easy comparison
    http://www.journalmetrics.com/
18. SNIP: Source-normalized impact per paper
    A journal's raw impact per paper (peer-reviewed papers only), measured relative to the citation potential in its subject field, which combines:
    - the field's frequency and immediacy of citation
    - database coverage
    - the journal's scope and focus
    Measured relative to the database median.
19. SNIP: Molecular Biology vs. Mathematics
    Journal                   RIP    Cit. Pot.   SNIP (RIP / Cit. Pot.)
    Inventiones mathematicae   1.5   0.4         3.8
    Molecular Cell            13.0   3.2         4.0
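The table above can be reproduced directly: SNIP is the journal's raw impact per paper (RIP) divided by the citation potential of its subject field. The inputs below are the rounded values printed on the slide, so the computed scores differ slightly in the second decimal from the slide's printed SNIP values.

```python
def snip(rip: float, citation_potential: float) -> float:
    """Source-normalized impact per paper: RIP divided by the field's
    citation potential (itself measured relative to the database median)."""
    return rip / citation_potential

inventiones = snip(1.5, 0.4)      # slide shows 3.8
molecular_cell = snip(13.0, 3.2)  # slide shows 4.0
```

The point of the example: the mathematics journal's raw impact (1.5) is an order of magnitude below the molecular biology journal's (13.0), yet after field normalization their SNIP scores are nearly equal.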
20. APIs are available to easily and freely embed these metrics on your journal homepages.
21. Snowball Metrics ...
    - Support universities' strategic decision-making processes
    - Aim to encompass the entire scope of key research and enterprise activities
    - What is the driver? Universities' metrics tend to suit their own data and priorities. With Snowball, they agree on a single method so that they can benchmark themselves against their peers.
    - What is special about them?
      - Owned by distinguished universities, including Oxford and Cambridge, and not imposed by, e.g., funders: universities taking control of their own destiny
      - Tried and tested methodologies that are available free of charge to the higher education sector
      - Academia-industry collaboration
22. Vision for Snowball Metrics
    Snowball Metrics drive quality and efficiency across higher education's research and enterprise activities, regardless of system and supplier, since they are the preferred standards used by research-intensive universities to view their own performance within a global context.
    Snowball Metrics Project Partners
23. Snowball Metrics Recipe Book
    Agreed and tested methodologies for new Snowball Metrics, and versions of existing Snowball Metrics, are and will continue to be shared free of charge. None of the project partners will at any stage apply any charges for the methodologies. Any organisation can use these methodologies for their own purposes, public service or commercial.
    (Extracts from Statement of Intent, October 2012)
    www.snowballmetrics.com/metrics
24. In summary
    1. Choose methods and metrics appropriate to the level and impact type being assessed (DORA)
    2. Don't confuse level with type (ALMs ≠ altmetrics)
       - Free and easy to embed Scopus Cited-by counts on article pages: http://www.developers.elsevier.com/
    3. Awareness of metrics correlates with acceptance; raising awareness matters
    4. APAC and younger researchers are open to new metrics
    5. Don't use just one metric; promote a variety of metrics
       - Free and easy to embed SNIP/SJR on journal homepages: http://www.journalmetrics.com/
    6. Choose transparent and standard methods and metrics
       - Learn more about Snowball Metrics: http://www.snowballmetrics.com/
25. Thank you!
    Michael Habib, MSLS, Product Manager, Scopus
    habib@elsevier.com | Twitter: @habib
    http://orcid.org/0000-0002-8860-7565
    http://www.mendeley.com/profiles/michael-habib/
