Russell Group PVCs 3 Jun 09
Presentation by Michael Jubb to Russell Group Pro-Vice Chancellors, June 2009
    Presentation Transcript

    • Research, Publication, Management… and the REF
      Michael Jubb, Research Information Network
      Russell Group PVCs (Research) Working Group, 3 June 2009
    • What are we trying to measure/assess?
      • productivity
        • individual, institutional and national volumes and shares of outputs
      • research impact
        • citations
        • readership and usage
      • networks
        • who is reading, citing, linking to whom
      • socio-economic impact
        • but what precisely and over what time-frame?
    • Researchers’ publication and dissemination behaviour
      • study commissioned by RIN
      • broad aim to gather and analyse evidence about:
        • the motivations, incentives and constraints that lead researchers in different subjects and disciplines to publish and disseminate their work in different ways and at different times;
        • how and why researchers cite other researchers’ work;
        • how researchers’ decisions on publication and citation are influenced (or not) by considerations arising from research assessment.
    • What kinds of outputs?
      • journals dominant across all disciplines (ex arts)
      • diverse range of other outputs
        • patents, instrument building, databases, web-based resources, working papers………
      RAE 2008 outputs (%):
        journal articles                sciences 79-99%   soc sci, arts & humanities 22-88%   ✓
        books, chapters                 sciences 0-3%     soc sci, arts & humanities 22-88%   ?
        conference papers               sciences 0-14%    soc sci, arts & humanities 0-5%     ?
        exhibitions, performances etc   sciences 0%       soc sci, arts & humanities 0-35%    x
        reviews, reports………             sciences 0-1%     soc sci, arts & humanities 0-1%     ?
    • Some measures of productivity
      [chart: values shown for UK percentage presence in different reports]
    • Why the differences?
      • data sources
        • WoS
          • SCI, SSCI, AHCI
          • database version
          • include letters and conference proceedings?
        • SCOPUS
      • year to be counted
        • year published in print or online?
      • counting method (see the sketch after this list)
        • integer counting
        • fractional counting
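      A minimal sketch of the difference between the two counting methods, in Python with invented papers and institution names (not from the original slides): integer counting gives every participating institution full credit for a paper, while fractional counting splits the credit equally.

        from collections import defaultdict

        # Invented example: each paper lists the institutions of its authors.
        papers = [
            ["Oxford", "Cambridge"],
            ["Oxford"],
            ["Oxford", "UCL", "Edinburgh"],
        ]

        integer_counts = defaultdict(int)       # full credit to every institution
        fractional_counts = defaultdict(float)  # credit split equally per paper

        for authors in papers:
            institutions = set(authors)
            for inst in institutions:
                integer_counts[inst] += 1
                fractional_counts[inst] += 1.0 / len(institutions)

        print(dict(integer_counts))     # Oxford gets 3 under integer counting
        print(dict(fractional_counts))  # but only 1 + 1/2 + 1/3 ≈ 1.83 fractionally

      The same choice applies one level down, to individual authors, which is one reason different reports arrive at different national shares.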
    • So what are researchers telling us? where, when and how to publish
      • key motivation is recognition by peers
        • peer review critically important
        • recognition measured by citation
        • career advancement
      • secondary motivation is maximising dissemination
        • tension between targeting best audience and highest quality journal
      • increasing collaboration and more co-authorship
        • significant rise in proportion of multi-authored works between 2003 and 2008
      • research assessment affects choices
      • signs of increase in productivity
        • small rise in no. of articles per author 2003-2008
    • Disciplinary differences?
      • books/book chapters equal in importance to journals in humanities
      • conference papers important in engineering
      • concerns about practice-led outputs
        • creative and performing arts
        • applied disciplines such as (aspects of) psychology, nursing and midwifery
    • So what are researchers telling us? citation behaviour (citations out)
      • some evidence of increase in volume of citations
      • varying reasons for citing others/types of citation
      • associated with types of output
        • articles, books, conference papers etc
      • motivated by
        • authority of cited material (64%) or of author (44%)
        • requirement to reference a method/theory/argument (53%)
        • guidance from others: mainly reviewers and editors (29%)
      • self citation of multi-authored papers
      • disciplinary differences
        • medical sciences tend not to cite conference papers
        • humanities cite personal communications and anecdotal refs
    • Some citation measures (citations in)
      • no. of citations
        • including and excluding self-citations
      • average no. of citations per publication
      • % of publications not cited
      • normalised worldwide average no. of citations per publication for a specific field (see the sketch after the averages table below)
        • percentile breakdowns
      • comparisons of individuals, groups, departments, institutions etc against normalised worldwide average for a specific field
      • quality profiles and percentile breakdowns
    • Some normalised worldwide averages, 2003-06
        field                   UK average (without self-citations)   worldwide average
        agriculture and food    2.51                                  2.24
        biological sciences     3.47                                  2.69
        biomedical sciences     4.83                                  4.02
        astronomy               5.25                                  4.23
        chemistry               3.18                                  2.52
        maths                   0.88                                  0.74
        civil engineering       0.71                                  0.82
        energy                  1.11                                  1.21
        economics               1.34                                  1.13
        psychology              2.25                                  2.06
        sociology               1.12                                  1.06
        history                 0.55                                  0.51
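      A hedged illustration of how such normalisation works: each paper's citation count is divided by the worldwide average for its field, so 1.0 means "cited exactly at world average". The field averages below are the worldwide figures from the table above; the paper-level citation counts are invented.

        # Field averages taken from the table above; papers are invented.
        world_average = {"chemistry": 2.52, "history": 0.51}

        papers = [              # (field, citations received)
            ("chemistry", 5),
            ("chemistry", 0),   # uncited paper
            ("history", 1),
        ]

        normalised = [cites / world_average[field] for field, cites in papers]
        indicator = sum(normalised) / len(normalised)
        uncited_share = sum(1 for _, c in papers if c == 0) / len(papers)

        print(f"mean field-normalised citation score: {indicator:.2f}")  # ~1.31
        print(f"share of uncited papers: {uncited_share:.0%}")           # 33%

      Note how the history paper, with a single citation, scores almost as highly as the chemistry paper with five: normalisation is what makes cross-field comparison meaningful at all.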
    • Some problems: coverage of the literature
      [chart: internal WoS coverage by field, banded <40%, 40-60%, 60-80%, 80-100%; fields shown: Molecular Biology & Biochemistry, App Physics & Chemistry, Mathematics, Other Soc Sci, Bio Science – humans, Bio Science – animals & plants, Economics, Humanities & Arts, Chemistry, Psychology & Psychiatry, Engineering, Clinical Medicine, Geosciences, Physics & Astronomy, Soc Sci in Medicine]
    • Some problems: timing and citation half-life
      • Time-lags and skewed distributions (illustrated in the sketch below)
      • Disciplinary, and sub-disciplinary, differences
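      A small illustration of the half-life idea (all citation counts invented): the citation half-life here is the year by which half of a paper's citations to date have accumulated, and it varies widely by discipline.

        # Invented yearly citation counts for one paper, years 1..7 after publication
        citations_per_year = [2, 8, 12, 9, 5, 3, 1]

        total = sum(citations_per_year)
        running = 0
        for year, n in enumerate(citations_per_year, start=1):
            running += n
            if running >= total / 2:
                print(f"half of {total} citations arrived by year {year}")  # year 3
                break

      A fast-moving biomedical paper might reach that point in two or three years; a history monograph may take a decade, which is why a fixed citation window disadvantages some fields.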
    • Some problems: multiple authorships
    • Other measures? the Hirsch index (h-index) [figure from: Lutz Bornmann (2006)]; a minimal computation is sketched below
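      The h-index itself is simple to state: a researcher has index h if h of their papers have at least h citations each. A minimal computation, with invented citation counts:

        def h_index(citations):
            """Largest h such that h papers have at least h citations each."""
            cites = sorted(citations, reverse=True)
            h = 0
            while h < len(cites) and cites[h] >= h + 1:
                h += 1
            return h

        print(h_index([10, 8, 5, 4, 3]))  # 4: four papers cited at least 4 times
        print(h_index([25, 8, 5, 3, 3]))  # 3: one huge paper barely moves h

      As the second example shows, the h-index is insensitive to a single highly cited paper, one of the behaviours that makes it controversial as a sole measure.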
    • Other measures?
      • network and page rank analysis (a toy PageRank sketch follows this list)
        • weighted transfers of prestige from one journal/researcher/institution to another
        • eigenfactor, SCImago journal ranking
      • usage measures
        • COUNTER metrics
        • network analysis
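      A toy sketch of the "weighted transfer of prestige" idea behind Eigenfactor-style rankings. The journal names, citation counts and damping factor below are all invented/assumed; the iteration is a generic PageRank-style power method over a journal citation graph, not the actual Eigenfactor algorithm.

        # Invented journal-to-journal citation counts: citing -> {cited: count}
        links = {
            "A": {"B": 3, "C": 1},
            "B": {"A": 2},
            "C": {"A": 1, "B": 1},
        }
        journals = list(links)
        rank = {j: 1.0 / len(journals) for j in journals}
        damping = 0.85  # conventional PageRank damping factor

        for _ in range(50):  # power iteration; 50 rounds is plenty for 3 nodes
            new_rank = {j: (1 - damping) / len(journals) for j in journals}
            for citing, cited_counts in links.items():
                out_total = sum(cited_counts.values())
                for cited, count in cited_counts.items():
                    # prestige flows in proportion to the share of outgoing citations
                    new_rank[cited] += damping * rank[citing] * count / out_total
            rank = new_rank

        print({j: round(rank[j], 3) for j in sorted(rank)})

      The key difference from raw citation counts: a citation from a highly ranked journal is worth more than one from an obscure journal, because prestige is transferred, not merely counted.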
    • Do they tell the same story?
    • Do they tell the same story? Earth and planetary sciences in selected institutions
    • So what are researchers telling us?
      • Has dissemination behaviour been influenced by RAE? Will it be influenced by REF?
        • across all disciplines, Yes
        • senior academics less affected, early-to-mid career academics more so
        • focus on publication
        • institutional strategies
      • a quarter of all researchers think the RAE excluded important research outputs
      • if the REF is based on citations, they will employ open access publishing more often (42%)
    • So what are researchers telling us?
      • Has citation behaviour been influenced by RAE?
        • across all disciplines (ex physical sciences), No or Not Sure
      • Will it be influenced by REF?
        • across all disciplines (ex economics), Yes, or Might
        • likely to cite collaborators’ work more often (38%) and competitors’ work less (13%)
    • Issues for REF
      • coverage of different kinds of outputs
        • disciplinary differences
      • accuracy of data
        • original citation
        • publication databases (publishers’ and institutions’)
      • definition of fields
        • interdisciplinarity
      • costs……….
      • selectivity or not?
      • who owns the publication or citation?
        • institution
        • individual
      • bottom up or top down analysis?
      • different metrics give different results
    • Relationship between metrics and peer review
      • “The future of research assessment exercises lies in the intelligent combination of metrics and peer review” (Henk Moed, CWTS, Leiden University)
      • peer review informed by bibliometrics, or bibliometrics moderated by peer review?
      • possibilities
        • let the type of peer review depend on the outcomes of the bibliometrics
        • use citation analysis for initial rankings and explicitly justify any subsequent deviations from them
    • Lessons for institutions?
      • even simple bibliometrics are not simple
        • and they are rapidly becoming more complex and sophisticated
        • need for bibliometric expertise to understand and be able to employ a range of measures
        • local, central or commercial services?
      • researchers’ motivations and behaviours are complex
        • need for assessments by others is implicit in all their motivations
        • rewards come from assessments; RAE/REF part of a wider ecology
        • disciplinary differences are real
        • institutional policies and strategies must take account of them
    • Lessons for institutions?
      • staff awareness and consultation
      • lessons from pilots
        • comprehensive research information systems
        • publications databases
          • not necessarily the same as the repository
        • accurate bibliographic data
      • author ID systems?
      • other lessons once current study completed?
      • Questions???
      • Michael Jubb
      • www.rin.ac.uk