Scholarly impact metrics traditions

Usage Rights: © All Rights Reserved

    Scholarly impact metrics traditions: Presentation Transcript

    • Scholarly Impact Metrics: Traditions and Transformations
      A guided tour of mainstream approaches to measuring scholarly impact, e.g. citation count and impact factor, and a glimpse into new developments, namely altmetrics.
    • Content
      • Why count citations?
      • What can they be used for?
      • Where to get citation information?
      • What are the metrics for individuals?
      • What are the metrics for journals?
      • What are the metrics for institutions?
      Part 1: TRADITIONS
    • Why count publications and citations?
      To measure the output and impact of scientific research.
    • Citations: In-text and References
      A citation is a reference to a paper or book used in one's research. In-text citations appear within the body of the text, and the references are appended at the end of the paper.
    • Citations: ACTIVITY 1
      With reference to the following short excerpt from a journal paper:
      1. Identify where an in-text citation is needed.
      2. Count the number of times in-text citations occur within the two paragraphs.
      Time: 3-5 minutes. Have a quick read, then as a group discuss and identify where an in-text citation should go and how many times in-text citations occur within the two paragraphs.
    • Previous studies have indicated that researchers are interested in scholarly communication issues, with attitudes and acceptance of new models and ways of working varying among populations in different fields, academic settings, and countries. In general, traditional publishing practices continue to be reinforced both by institutionalized structures such as tenure and promotion criteria and by "a fundamentally conservative set of faculty attitudes". New dissemination practices appear to flourish primarily for in-progress or preliminary communication, rather than for final, formal publication.
      Posting a paper to an individual's web page has been shown to be nearly twice as common as any other method for sharing research informally. Discipline-based open archives are fairly popular: the proportion of mathematicians depositing research papers in a subject repository has been variously gauged at 38% and 20%, higher than the average for all subjects. Such online self-archiving, officially recommended by the International Mathematical Union, follows the important pre-Internet tradition of sharing mathematics preprints. In pure mathematics, more researchers consider preprints the most essential resource (44%) than journal articles (approximately 33%); this was the highest preprint rating in any discipline, followed by statistics and operational research at 25%. Besides the preprints and published articles usually deposited, a 2009 survey found that 14% of mathematics / statistics faculty members had deposited raw materials such as images or field notes in a repository, and 10% had deposited raw data sets.
      An excerpt from K. Fowler's "Mathematicians' views on current publishing issues"
    • Mathematicians’ views on current publishing issues, PDF
    • Why Does One Need to Cite?
      1. Research is not done in isolation; it builds on the work of others.
      2. Citing provides evidence for arguments and adds credibility by demonstrating that you have sought different viewpoints.
      3. It lets you engage in academic conversations: responding to this person, agreeing with that person, adding something to what has been said, etc.
      4. It allows readers to find your sources of information (footprints).
      ... among other reasons.
    • Some Issues with Citations
      1. Authors do not cite the majority of their influences. A study by MacRoberts & MacRoberts in 1989 found that authors cite only about 30% of their influences.
      2. Biased citing. In the same study, the authors found that some facts were correctly cited, while others were never credited or were credited to secondary sources.
      3. Self-citing.
      4. Some citations are affirmative, others may be negative; authors avoid giving negative credit.
      5. Different disciplines have different citation rates.
    • Pitfalls and Limitations in Citation Analysis
      • Positive and negative citations are not distinguished
      • Intended or unintended omission
      • Biased citing practices
      • Informal influences are not included
      • Assumptions about the reasons for citing a work may not be valid
      • Different fields have different citation rates
      • Some types of publications tend to have higher citation rates
      • Problems in indexing, citing references, clerical errors, misinterpretation
      • Non-comprehensive coverage
    • Sources of Citation Data
      1. Web of Science (output, citation count)
      2. SciVerse Scopus (output, citation count)
      3. Google Scholar (output, citation count)
      4. Journal Citation Reports (impact factor)
      5. Scimago (journal ranking)
      6. Essential Science Indicators (highly cited papers, hot papers, etc.)
    • Citation Database: What's in It
      [Venn diagram comparing the contents of Citation Database A and Citation Database B]
    • The "Big Three": the overlap is quite modest
      ISI / WoS:
      • Approx. 12,000 journals
      • Limited coverage of humanities, e.g. monographs not included
      • Limited coverage of OA journals & conference papers, despite some recent additions
      • Majority Anglo-Saxon in origin; English-language bias
      • Contains useful tools for author disambiguation (links to ResearcherID)
      • Oldest coverage, from the 1900s (for the Sciences & Social Sciences)
      Scopus:
      • Approx. 18,000 titles
      • Greater geographic spread than WoS: 60% is from outside the U.S.
      • Better inclusion of non-journal material, e.g. conference papers
      • Contains useful tools for author disambiguation
      • Limited coverage before 1995
      Google Scholar:
      • Widest range of material included, although no list of journals covered
      • Gaps in the coverage of publishers' archives; no indication of timescale covered
      • Results often contain duplicates of the same article (e.g. pre-prints, post-prints)
      • No way to distinguish between authors with the same initials
      • Difficult to search for a journal which has various title abbreviations
    • Overlaps (as of early 2013)
      [Venn diagram showing the overlap in coverage between the databases, including Google Scholar]
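The overlap shown in the Venn diagram can be sketched with plain set operations over article identifiers. A minimal sketch, assuming each database exposes a set of DOIs; the DOI values below are invented for illustration, not real coverage data.

```python
# Hypothetical coverage sets for three citation databases, keyed by DOI.
wos = {"10.1/a", "10.1/b", "10.1/c"}
scopus = {"10.1/b", "10.1/c", "10.1/d"}
gscholar = {"10.1/a", "10.1/b", "10.1/c", "10.1/d", "10.1/e"}

# Intersection: articles indexed by all three databases.
in_all_three = wos & scopus & gscholar
# Difference: articles only Google Scholar covers.
only_gscholar = gscholar - (wos | scopus)

print(sorted(in_all_three))   # ['10.1/b', '10.1/c']
print(sorted(only_gscholar))  # ['10.1/e']
```

With real data, the modest size of `in_all_three` relative to each set is what the slide's "overlap is quite modest" claim refers to.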
    • Researcher Metrics: h-index
      The h-index is a measure of productivity and influence. A researcher with an index of h has published h papers, each of which has been cited at least h times.

      Physics:
      E. Witten 110
      A. J. Heeger 107
      M. L. Cohen 94
      A. C. Gossard 94

      Life Sciences:
      S. H. Snyder 191
      D. Baltimore 160
      R. C. Gallo 154
      P. Chambon 153

      Source: Hirsch, J. E. (2005). An index to quantify an individual's scientific research output. PNAS, 102(46), 16569-16572.
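The definition above translates directly into code. A minimal sketch: sort the per-paper citation counts in descending order and find the largest rank h at which the paper still has at least h citations. The citation counts in the example are invented for illustration.

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts:
    the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: 5 papers cited 10, 8, 5, 4, and 3 times.
# Four papers have at least 4 citations, but only three have at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```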
    • Strengths of the h-index
      • Simple to calculate
      • Combines output and impact
      • Depicts "durable" performance, not single achievements
      Weaknesses of the h-index
      • Discriminates against young researchers
      • Will not capture small but high-quality output
      • May not depict recent performance: h never declines, so one can "rest on one's laurels"
    • Journal Metrics
      A. Journal Citation Reports (Web of Knowledge)
      • Science edition: over 8,000 journals
      • Social Sciences edition: over 2,600 journals
      Metrics provided:
      • Impact factor
      • Five-year impact factor
      • Cited half-life
      • Immediacy index
    • JCR – Impact Factor
      The journal Impact Factor (IF) is the average number of times articles from the journal published in the past two years have been cited in the JCR year.

      IF of Journal A (2011) = (no. of times Journal A's 2009 & 2010 papers were cited in 2011, per Web of Science) / (no. of citable papers published in Journal A in 2009 & 2010)

      In short: an Impact Factor of 1.0 means that, on average, the articles published one or two years ago have been cited one time; an Impact Factor of 2.5 means that, on average, they have been cited two and a half times. A journal with a higher citation rate gives your article a higher chance of getting cited.
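The two-year formula above can be sketched as a one-line calculation. The counts in the example are invented for illustration, not real JCR data.

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """IF for year Y = citations received in Y by items published in
    Y-1 and Y-2, divided by citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Journal A: its 2009 & 2010 papers were cited 500 times in 2011,
# and it published 200 citable items across 2009-2010.
print(impact_factor(500, 200))  # 2.5
```

This matches the slide's reading: an IF of 2.5 means the average article from the previous two years was cited two and a half times in the JCR year.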
    • Journal Metrics
      B. Scimago Journal Rank: uses SciVerse Scopus data.
      It expresses the average number of weighted citations received in the selected year by the documents published in the selected journal in the three previous years. It takes into account both the number of citations received by the journal and the prestige of the journals the citations came from.
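The "weighted citations" idea can be sketched as follows: instead of counting every citation equally, each is multiplied by the prestige of the citing journal. This is only a toy illustration with invented journal names, citation counts, and prestige scores; the real SJR indicator is computed iteratively, PageRank-style, over the whole Scopus citation network.

```python
def weighted_citations(cites_by_journal, prestige):
    """Sum citations received, each weighted by the citing journal's
    prestige score (hypothetical scores, for illustration only)."""
    return sum(n * prestige[j] for j, n in cites_by_journal.items())

cites = {"Journal X": 10, "Journal Y": 40}       # citations received, by source
prestige = {"Journal X": 2.0, "Journal Y": 0.5}  # prestige of citing journals

# 10 citations from a prestigious journal count as much here as
# 40 citations from a less prestigious one: 10*2.0 + 40*0.5 = 40.0
print(weighted_citations(cites, prestige))  # 40.0
```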
    • The Journal Impact Factor Game
      1. Editors requiring authors to include papers from the journal or affiliated journals in their references
      2. Rejecting negative studies, regardless of quality
      3. Publishing mainly popular-science articles that deal with hot topics
      4. Publishing summaries of articles with relevant citations to them
      5. Publishing articles that add citations but are not counted as citable
      6. Favouring review articles over original papers
      Source: Falagas, M. E., & Alexiou, V. G. (2008). The top ten in journal impact factor manipulation. Arch. Immunol. Ther. Exp., 56, 223-226.
    • Journal Titles Suppressed in JCR
      • JCR monitors journal self-citations; under 20% is considered acceptable.
      • The list of titles suppressed for the year is reported in the release notes of each JCR version.
      • What titles are suppressed? "Suppressed titles were found to have anomalous citation patterns resulting in a significant distortion of the Journal Impact Factor, so that the rank does not accurately reflect the journal's citation performance in the literature."
    • Institutional Rankings
      Some of the players are:
      1. Times Higher Education (THE) World University Rankings
      2. QS World University Rankings
      3. Academic Ranking of World Universities (ARWU)
    • ACTIVITY 2
      Identify the institutions ranked 1-5, 6-10, 11-15, 16-20, 21-25, etc. in the three different rankings (THE 2012-2013, QS 2012, ARWU 2012). Take note of the names of institutions that appear in all three rankings.
    • Top 10
      Rank | THE World University Rankings 2012-2013 (released Oct 2012) | QS World University Rankings 2012 (released Sep 2012) | Academic Ranking of World Universities 2012 (released Aug 2012)
      1 | Caltech | MIT | Harvard University
      2 | Stanford University | University of Cambridge | Stanford University
      3 | University of Oxford | Harvard University | MIT
      4 | Harvard University | University College London | University of California Berkeley
      5 | MIT | University of Oxford | University of Cambridge
      6 | Princeton University | Imperial College London | Caltech
      7 | University of Cambridge | Yale University | Princeton University
      8 | Imperial College London | University of Chicago | Columbia University
      9 | University of California Berkeley | Princeton University | University of Chicago
      10 | University of Chicago | Caltech | University of Oxford
      Institutions in red occur in all 3 rankings
    • Ranked 11 – 20
      Rank | THE 2012-2013 | QS 2012 | ARWU 2012
      11 | Yale University | Columbia University | Yale University
      12 | ETH Zurich | University of Pennsylvania | University of California, LA
      13 | University of California, LA | ETH Zurich | Cornell University
      14 | Columbia University | Cornell University | University of Pennsylvania
      15 | University of Pennsylvania | Stanford University | University of California, San Diego
      16 | Johns Hopkins University | Johns Hopkins University | University of Washington
      17 | University College London | University of Michigan | Johns Hopkins University
      18 | Cornell University | McGill University | University of California, San Francisco
      19 | Northwestern University | University of Toronto | University of Wisconsin - Madison
      20 | University of Michigan | Duke University | The University of Tokyo
      Institutions in purple occur in all 3 rankings
    • Ranked 21 – 30
      Rank | THE 2012-2013 | QS 2012 | ARWU 2012
      21 | University of Toronto | University of Edinburgh | University College London
      22 | Carnegie Mellon University | University of California Berkeley | University of Michigan
      23 | Duke University | University of Hong Kong | ETH Zurich
      24 | University of Washington | Australian National University | Imperial College London
      25 | University of Texas at Austin | National University of Singapore | University of Illinois at Urbana-Champaign
      26 | Georgia Institute of Technology | Kings College London | Kyoto University
      27 | University of Tokyo | Northwestern University | New York University
      28 | University of Melbourne | University of Bristol | University of Toronto
      29 | National University of Singapore | Ecole Polytechnique Federale de Lausanne | University of Minnesota
      30 | University of British Columbia | University of Tokyo | Northwestern University
    • THE's Methodology (since 2010)
      The Times Higher Education World University Rankings use 13 performance indicators, grouped under 5 areas, covering the core missions of institutions: teaching, research, knowledge transfer, and international outlook.
      1. Teaching: the learning environment (30%). Reputation survey, staff-to-student ratio, ratio of undergraduates to PhD degrees awarded, etc.
      2. Research: volume, income, and reputation (30%). University reputation accounts for 17% and is based on an academic reputation survey. Research productivity, scaled against staff numbers, accounts for 6%: the number of papers published per academic in academic journals indexed by Thomson Reuters.
    • THE's Methodology, Continued
      3. Citations: research influence (30%). Reflects the role of universities in spreading new knowledge and ideas, measured as the number of times a university's published works have been cited by others. Data is provided by Thomson Reuters: papers published between 2006 and 2010, with citations counted for the period 2006 to 2011. Data is normalised to reflect variations in citation volume between different disciplines.
      4. Industry income: innovation and knowledge transfer (2.5%). How much research income an institution earns from industry.
      5. International outlook: staff, students, and research (7.5%). Diversity on campus and international research collaborations; the ability of a university to attract international students and faculty; plus the proportion of papers, over the 5 years, that have at least one international co-author.
    • QS Methodology (since 2011)
      • Academic reputation, from a global survey: 40%
      • Employer reputation, from a global survey: 10%
      • Citations per faculty (data from Scopus): 20%. The latest 5 years of data is used. Total citation count is factored against the number of faculty, thus taking into account the size of the institution. Self-citations have been excluded since 2011.
      • Faculty-student ratio: 20%
      • Proportion of international students: 5%
      • Proportion of international faculty: 5%
    • ARWU Methodology (since 2003)
      Criteria | Indicator | Code | Weight
      Quality of Education | Alumni of an institution winning Nobel Prizes and Fields Medals | Alumni | 10%
      Quality of Faculty | Staff of an institution winning Nobel Prizes and Fields Medals | Award | 20%
      Quality of Faculty | Highly cited researchers in 21 broad subject categories | HiCi | 20%
      Research Output | Papers published in Nature and Science* | N&S | 20%
      Research Output | Papers indexed in Science Citation Index-Expanded and Social Science Citation Index | PUB | 20%
      Per Capita Performance | Per capita academic performance of an institution: weighted scores of the above 5 indicators divided by the number of FTE academic staff | PCP | 10%
      Total | | | 100%
      * For institutions specialized in humanities and social sciences, such as the London School of Economics, N&S is not considered, and its weight is reallocated to the other indicators.
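Combining the table's weights into a total score can be sketched as a weighted sum. The weights come from the table above; the individual indicator scores in the example are invented for illustration (in the real ranking, each indicator is scored relative to the top-performing institution).

```python
# Indicator weights from the ARWU methodology table above.
WEIGHTS = {"Alumni": 0.10, "Award": 0.20, "HiCi": 0.20,
           "N&S": 0.20, "PUB": 0.20, "PCP": 0.10}

def arwu_total(scores):
    """Weighted sum of the six ARWU indicator scores (each 0-100)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical indicator scores for one institution.
example = {"Alumni": 70, "Award": 80, "HiCi": 60,
           "N&S": 65, "PUB": 90, "PCP": 55}
print(round(arwu_total(example), 1))  # 71.5
```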
    • How Does This Affect Our Users?
      Research output measures productivity; citation count, h-index, g-index, etc. measure influence.
      User | Uses of data & metrics
      Faculty | Grant proposals, performance appraisal, promotion & tenure (research impact)
      Early-career researcher | Demonstrating research impact and seeking employment in academic / research institutions
      School & admin staff | Selection of staff (recruitment, promotion, tenure), selection of external examiners, internal ranking of journals (Tier 1, 2, 3 journals)
      University & country | Institutional rankings, ROI
    • Transformations
    • Activities
      To access Journal Citation Reports, start from the Library toolbar or the Library home page:
      1. Click on "Databases"
      2. Click on "Citation Databases"
      3. Click on "Journal Citation Reports"
      4. Click on "I Accept"
      To access Scimago, search in Google and select Scimago Journal & Country Rank.