
Metrics are rubbish, but ...

Presentation at "Doing metrics responsibly: how far have we come and what are the next steps" SIG at Association of Research Managers and Administrators (ARMA) Annual Conference, Liverpool, 7th June 2017

A panel presentation based on my experience as a member of the Computer Science and Informatics sub-panel at REF2014, and on a subsequent metrics-based analysis of public-domain REF data, which revealed order-of-magnitude biases.

My final conclusion is:

metrics are rubbish
but ...
people are far worse

http://alandix.com/academic/papers/ARMA-2017-metrics/


  1. metrics are rubbish but … Alan Dix, University of Birmingham and Talis. http://alandix.com/ref2014/ Doing metrics responsibly, ARMA Liverpool 2017
  2. Who: Alan Dix. First love: mathematics, all of it! … including numerical models and statistics. Academic for 35 years: human–computer interaction + all sorts. REF 2014 panel member, SP11 Computer Science and Informatics
  3. Using metrics to evaluate: individuals (edge cases, long-term work); departments and institutions (aggregate stats, but skewed distributions); impact on academic policy?
  4. Using metrics to evaluate: individuals (edge cases, long-term work); departments and institutions (aggregate stats, but skewed distributions); the evaluation process – REF! (even larger numbers, 1*–4* buckets, de-skew?); impact on academic policy
  5. Bias in metrics? Less research-intensive universities: smaller contact networks, poor choice of venue. Applied or cross-disciplinary research: impact outside academia, harder-to-track citations
  6. What – REF public-domain data: a (virtually) complete list of outputs, excluding a few confidential ones. For each: name, DOI, ACM topic area, Scopus citations (where used), Google Scholar citations (gathered after REF, not used in assessment). UoA profiles; sub-area profiles. N.B. highlighted data SP11 only
  7. How – bibliometrics: citation analysis via Scopus & Google Scholar. N.B. metrics to validate evaluation ≠ metrics for evaluation. Large numbers: 500–1000 per sub-area, low 100s per institution
  8. Results – scary: apparent emergent bias, 5–10 fold! Sub-areas: theory vs. applied. Institutions: Russell Group vs. pre- vs. post-1992, + gender. Where metrics may be biased … the panel more so! Having impact on policy: recruitment, investment
  9. in summary …
  10. Alan Dix, University of Birmingham and Talis. http://alandix.com/ref2014/ Doing metrics responsibly, ARMA Liverpool 2017. metrics are rubbish but … people are worse (far)
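The skewed-distributions point on slides 3–4 can be illustrated with a small sketch. The citation counts below are synthetic (drawn from a heavy-tailed log-normal distribution, which citation data typically resembles) — they are not REF data, and the numbers are purely illustrative:

```python
import random
import statistics

random.seed(1)

# Hypothetical citation counts for 500 outputs: heavy-tailed, so a
# handful of highly cited papers dominate. NOT real REF data.
citations = [int(random.lognormvariate(1.5, 1.2)) for _ in range(500)]

mean = statistics.mean(citations)
median = statistics.median(citations)

# Under heavy skew the mean sits well above the median: averaging
# citation counts across a department can flatter (or hide) the
# typical output, which is why aggregate stats need de-skewing.
print(f"mean = {mean:.1f}, median = {median}")
```

With skewed data like this the mean lands well above the median, so two departments with identical "typical" outputs can show very different average citation counts if one happens to hold a single highly cited paper.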
