Harassing with Numbers: the Uses and Abuses of Bureaucracy and Bibliometry


by Giuseppe De Nicolao.
Presented at the 13th European Control Conference, Lunch Session - Friday, June 27, 2014 (http://www.ecc14.eu/lunch-sessions.html).

Quantitative measures of academic performance are playing an ever more important role in everyday academic life. Numerical indicators are key ingredients of the "reputation race" exemplified by rankings of institutions, journals and researchers. In an era of budget cuts, institutions and individuals must also resort to quantitative indicators in order to prove "accountable" and justify their cost to the public. This may involve an excessive increase of bureaucratic burdens, to the point of harming the overall efficiency of teaching and research. Moreover, the question arises whether the scientific productivity of scholars can be quantitatively and accurately measured by means of individual bibliometry. In some countries, there is a clear trend towards the normative adoption of bibliometric indicators at all levels, ranging from national research assessments to decisions regarding individuals, such as hiring and promotion. Is this feasible? What are the caveats and ethical risks? What does the scientometric literature have to say, and what are the international experiences? After the fall of the ivory towers, are we doomed to a bureaucratic and bibliometric deluge?


  1. 1. Harassing with Numbers: the Uses and Abuses of Bureaucracy and Bibliometry Giuseppe De Nicolao Università di Pavia
  2. 2. • Quality assurance, accountability, rankings, efficient allocation of resources, objective measures of performance, ... • Boring stuff and lots of bureaucracy ... but it changes our lives and also impacts research and teaching • Let's begin with a story
  3. 3. Outline • The most accurate ranking ever produced • Should you believe in university rankings? • Governing by numbers • The end is not near. It is here • The Good, the Bad, and the Ugly • The quest for the Holy Grail • You're gonna hear me roar!
  4. 4. THE MOST ACCURATE RANKING EVER PRODUCED
  5. 5. September 16 2010
  6. 6. New York Times, November 14, 2010 Alexandria’s surprising prominence was actually due to “the high output from one scholar in one journal” — soon identified on various blogs as Mohamed El Naschie, an Egyptian academic who published over 320 of his own articles in a scientific journal of which he was also the editor.
  7. 7. November 26 2008 flashback ...
  8. 8. December 2009
  9. 9. • ... of the 400 papers by El Naschie indexed in Web of Science, 307 were published in Chaos, Solitons and Fractals (CSF) alone while he was editor-in-chief. • El Naschie's papers in CSF received 4992 citations, about 2000 of which come from papers published in CSF, largely his own.
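The slide's figures already imply a striking within-journal citation share; a minimal sketch of the arithmetic, using the numbers exactly as quoted on the slide:

```python
# Figures as quoted on the slide (approximate, per the original talk).
total_cites = 4992    # citations received by El Naschie's papers in CSF
same_journal = 2000   # of these, citations coming from papers in CSF itself

share = same_journal / total_cites
print(f"{share:.0%} of the citations come from the same journal")  # ~40%
```

Roughly 40% of the citations thus originate in the very journal he edited, which is what inflated both the journal's Impact Factor and the university ranking in the story.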
  10. 10. All ingredients in one story • reputation race at work • “the most accurate picture ... ever produced” • gaming affecting: – university rankings – journal Impact Factor – individual bibliometrics Objection: this is just an outlier Be serious: no one can be so stupid as to let important decisions depend on questionable rankings
  11. 11. wait ...
  12. 12. Who is a “highly skilled migrant” in the Netherlands? Decided by the rankings
  13. 13. Highly skilled migrants Can I become a highly skilled migrant in the Netherlands - even if I haven't got a job yet? To be eligible, you must be in possession of one of the following diplomas or certificates: • a master's degree or doctorate from a recognised Dutch institution of higher education or • a master's degree or doctorate from a non-Dutch institution of higher education which is ranked in the top 150 establishments in either the Times Higher Education 2007 list or the Academic Ranking of World Universities 2007 issued by Shanghai Jiao Tong University in 2007
  14. 14. Sardegna (an Italian region): am I eligible for a scholarship to attend a PhD? Decided by the rankings
  15. 15. APPLICATION WILL BE SCORED BASED ON PRESTIGE OF PHD SCHOOL ACCORDING TO QS WORLD UNIVERSITY RANKINGS
  16. 16. Outline • The most accurate ranking ever produced • Should you believe in university rankings? • Governing by numbers • The end is not near. It is here • The Good, the Bad, and the Ugly • The quest for the Holy Grail • You're gonna hear me roar!
  17. 17. SHOULD YOU BELIEVE IN UNIVERSITY RANKINGS?
  18. 18. Shanghai ranking: indicators
  19. 19. The “normalization trap” 1/2
  20. 20. The “normalization trap” 2/2
  21. 21. Should you believe in the Shanghai ranking? An MCDM view J.-C. Billaut D. Bouyssou P. Vincke • all criteria used are only loosely connected with what they intended to capture. • several arbitrary parameters and many micro-decisions that are not documented. • flawed and nonsensical aggregation method • «the Shanghai ranking is a poorly conceived quick and dirty exercise» «any of our MCDM student that would have proposed such a methodology in her Master’s Thesis would have surely failed according to our own standards»
  22. 22. Shanghai ranking: how reliable? Hundreds of universities with similar scores [chart: score vs. rank]
  23. 23. The ShanghaiRanking [chart]
  24. 24. Intermezzo A PRACTICAL GUIDE TO CLIMBING (THE RANKINGS)
  25. 25. Twenty Ways to Rise in the Rankings (1/3) by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html 1. Get rid of students. The university will therefore do better in the faculty student ratio indicators. 2. Kick out the old and bring in the young. Get rid of ageing professors, especially if unproductive and expensive, and hire lots of temporary teachers and researchers. 5. Get a medical school. Medical research produces a disproportionate number of papers and citations which is good for the QS citations per faculty indicator and the ARWU publications indicator. Remember this strategy may not help with THE who use field normalisation.
  26. 26. 7. Amalgamate. What about a new mega-university formed by merging LSE, University College London and Imperial College? Or a très grande école from all those little grandes écoles around Paris? 9. The wisdom of crowds. Focus on research projects in those fields that have huge multi-“author” publications - particle physics, astronomy and medicine, for example. Such publications often have very large numbers of citations. 10. Do not produce too much. If your researchers are producing five thousand papers a year, then those five hundred citations from a five-hundred-“author” report on the latest discovery in particle physics will not have much impact. Twenty Ways to Rise in the Rankings (2/3) by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
  27. 27. 13. The importance of names. Make sure that your researchers know which university they are affiliated to and that they know its correct name. Keep an eye on Scopus and ISI and make sure they know what you are called. 18. Support your local independence movement. Increasing the number of international students and faculty is good for both the THE and QS rankings. If it is difficult to move students across borders why not create new borders? 20. Get Thee to an Island. Leiden Ranking has a little known ranking that measures the distance between collaborators. At the moment the first place goes to the Australian National University. Twenty Ways to Rise in the Rankings (3/3) by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
  28. 28. Rankings • Fragile scientific grounds • Cost of providing data • Incentive to gaming • Raw data are obscured Why, then?
  29. 29. Outline • The most accurate ranking ever produced • Should you believe in university rankings? • Governing by numbers • The end is not near. It is here. • The Good, the Bad, and the Ugly • The quest for the Holy Grail • You're gonna hear me roar!
  30. 30. GOVERNING BY NUMBERS
  31. 31. Rankings are based on composite indicators Science or pseudo-science?
  32. 32. Conflicting opinions • Non-aggregators: key objection to aggregation: the arbitrary nature of the weighting process by which the variables are combined • Aggregators: value in combining indicators: extremely useful in garnering media interest and hence the attention of policy makers
  33. 33. Can you really govern by numbers? Let’s see a survey of reactions to university rankings
  34. 34. Germany • “We look back decades and people came to German universities; today they go to US universities.” • The Exzellenzinitiative (2005): from traditional emphasis on egalitarianism towards competition and hierarchical stratification
  35. 35. France • The Shanghai ranking “generated considerable embarrassment among the French intelligentsia, academia and government: the first French higher education institution in the ranking came only in 65th position, mostly behind American universities and a few British ones”
  36. 36. Australia • The Shanghai and QS: at least two Australian universities among the top 100. • Opposing strategic options: – fund a small number of top-tier competitive universities – “creation of a diverse set of high performing, globally-focused institutions, each with its own clear, distinctive mission”.
  37. 37. Japan • “The government wants a first class university for international prestige ” • “in order for Japanese HEIs to compete globally, the government will close down some regional and private universities and direct money to the major universities” • some institutions will become teaching only.
  38. 38. Italy The Education and University Minister: «We are lagging behind in the world rankings. For this reason we are going to present the reform of the University [...] I hope I will never again see the first Italian university ranked 174th»
  39. 39. [Chart: yearly state funding of Italian universities vs. Harvard's operating expenses, 2012, billion euro] Harvard's yearly operating expenses = 44% of the yearly funding of all Italian state universities
  40. 40. E. Hazelkorn: “Estimated yearly budget of €1.5 billion to be ranked in the world’s top 100”
  41. 41. Why obsess about the “top 1%”? [chart: top 1% vs. the other 16,500 universities]
  42. 42. Answer: trickle-down knowledge!
  43. 43. E. Hazelkorn on rankings • 90% or 95% of our students do not attend elite institutions. Why are we spending so much on what people aren't attending, as opposed to what they are attending? • May divert resources from pensions, health, housing, ... • Are “elite” institutions really driving national or regional economic and social development?
  44. 44. Does trickle-down work? E. Hazelkorn: “Governments and universities must stop obsessing about global rankings and the top 1% of the world's 15,000 institutions. Instead of simply rewarding the achievements of elites and flagship institutions, policy needs to focus on the quality of the system-as-a-whole.” There is little evidence that trickle-down works.
  45. 45. Where are we? • (Even) Phil Baty (Times Higher Education) admits that there are aspects of academic life where rankings are of little value • Can we/you afford the ‘reputation race’? • We will have to live in a world in which extremely poor rankings are regularly published and used. What can be done then?
  46. 46. some advice from the authors of “Should You Believe in the Shanghai Ranking?”
  47. 47. “Stop being naive” • There is no such thing as a ‘‘best university’’ in abstracto. • Stop talking about these ‘‘all purpose rankings’’. They are meaningless. • Lobby in our own institution so that these rankings are never mentioned in institutional communication
  48. 48. utopian?
  49. 49. U.S. News College Rankings are very influential. Nevertheless ...
  50. 50. from “Is There Life After Rankings?” • Not cooperating with the rankings affects my life and the life of the college in several ways. Some are relatively trivial; for instance, we are saved the trouble of filling out U.S. News's forms, which include a statistical survey that has gradually grown to 656 questions • The most important consequence of sitting out the rankings game, however, is the freedom to pursue our own educational philosophy, not that of some newsmagazine.
  51. 51. Outline • The most accurate ranking ever produced • Should you believe in university rankings? • Governing by numbers • The end is not near. It is here • The Good, the Bad, and the Ugly • The quest for the Holy Grail • You're gonna hear me roar!
  52. 52. THE END IS NOT NEAR ... IT IS HERE
  53. 53. THE END IS NOT NEAR. IT IS HERE August 22, 2011
  54. 54. WHAT’S THE SOURCE?
  55. 55. Not only Italy: all European science collapsed from 2008 to 2009!
  56. 56. ARMAGEDDON DAY FOR EUROPEAN RESEARCH? WAIT A MINUTE....
  57. 57. Let us ask SCOPUS: no evidence of collapse [chart: Italy's scientific documents, 1996-2010]
  58. 58. No trace of collapse for all other countries (Europe and non-Europe). Then, what explains the graph in the scientific paper?
  59. 59. EXAMPLE: DUE TO WELL KNOWN RECORDING DELAYS IN BIBLIOMETRIC DATABASES, IN 2010 THE NATIONAL SCIENCE FOUNDATION REGARDED 2008 AND 2009 DATA AS UNRELIABLE. IT'S JUST A MATTER OF DELAYS
  60. 60. The moral of the story Bibliometric data from the last two years are not in steady state: do not use them for scientific (or assessment) purposes
  61. 61. Comment 1. For the (bibliometric) bureaucrat, a number is something objective and trustworthy. 2. Awareness of errors, uncertainty, relevance and manipulability is usually very low. 3. The administrative and normative use of bibliometry is extremely fragile. 4. In the last two years, hundreds if not thousands of Italian researchers spent a lot of time asking Web of Science and Scopus to update/correct their bibliometric profiles.
  62. 62. Outline • The most accurate ranking ever produced • Should you believe in university rankings? • Governing by numbers • The end is not near. It is here • The Good, the Bad, and the Ugly • The quest for the Holy Grail • You're gonna hear me roar!
  63. 63. Italy: three actors 1. Political power 2. Higher Education System 3. Evaluation Agency
  64. 64. Italy: budget for curiosity-driven national research projects [PRIN]
  65. 65. ON THE OTHER HAND, WHY SHOULD WE PAY SCIENTISTS?
  66. 66. Some other numbers
  67. 67. Higher education expenditure (% GNP): Source: OECD 2013
  68. 68. EU-28: tertiary education [chart: % of population aged 30-34 with tertiary education attainment; Italy highlighted]
  69. 69. Research efficiency (red: ITALY)
  70. 70. Hence, what's the priority? Here is the answer by ANVUR • Low expenditure • % tertiary degrees: last in Europe • good research efficiency
  71. 71. SERGIO BENEDETTO (CONSIGLIO DIRETTIVO ANVUR) 4–02-2012 “We will give report cards to professors in order to rank universities ...
  72. 72. SERGIO BENEDETTO (CONSIGLIO DIRETTIVO ANVUR) 4–02-2012 ... and some sites will have to be closed”
  73. 73. Big question: how can you give report cards to professors?
  74. 74. Outline • The most accurate ranking ever produced • Should you believe in university rankings? • Governing by numbers • The end is not near. It is here • The Good, the Bad, and the Ugly • The quest for the Holy Grail • You're gonna hear me roar!
  75. 75. THE QUEST FOR THE HOLY GRAIL
  76. 76. The Holy Grail of research assessment • Peer review is subjective, lengthy, and expensive • We have a lot of bibliometric data on journals (e.g. IF) and scientists: – # papers – # cites – h-index – ... • Solution: work out indicators that yield an objective, quick and inexpensive bibliometric assessment, also at the individual level
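As a reminder of what these indicators actually compute, here is a minimal sketch of the standard Hirsch h-index definition (the textbook definition, not any particular database's implementation):

```python
def h_index(citations):
    """h = the largest h such that at least h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:   # the rank-th most cited paper still has >= rank citations
            h = rank
        else:
            break
    return h

# Two hypothetical publication records with the same number of papers:
print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 8, 5, 3, 3]))  # 3 - one hugely cited paper barely moves h
```

The second example hints at why no single number captures "quality": very different citation profiles collapse onto similar values, which is part of what the talk's quest for the Holy Grail runs into.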
  77. 77. the problem is not a new one ...
  78. 78. Assessing poetry ... determining a poem's greatness becomes a relatively simple matter. If the poem's score for perfection is plotted along the horizontal of a graph, and its importance is plotted on the vertical, then calculating the total area of the poem yields the measure of its greatness.
  79. 79. Assessing research (the Italian way) [chart axes: Impact Factor vs. # citations]
  80. 80. ANVUR proposal: use bibliometry, # of citations (and informed peer review) [Matrix, separately for recent and old articles: each article is classed A-D by crossing the journal's bibliometric class (IF, ...) with its citation class; for the matrix entries labeled IR, informed peer review is relied upon]
  81. 81. Research as target shooting A paper is an arrow aiming at high IF & cites
  82. 82. Italian research assessment: colors and scores E = 1, B = 0.8, A = 0.5, L = 0
  83. 83. Target specification [chart, from best to worst class: 20%, 20%, 10%, 50%]
  84. 84. Was target specification respected? No!
  85. 85. Due to a flawed design, the actual targets did not match the specification and differed between scientific areas
  86. 86. Medical Sciences vs. Industrial & Information Engineering [chart - Medical Sciences: 40%, 25%, 14%, 21%; Industrial & Information Engineering: 22%, 21%, 13%, 44%]
  87. 87. The moral of the story: since the targets in different areas (and disciplines!) do not match, scores are not comparable and any subsequent aggregation (e.g. for funding) becomes nonsensical. We failed the quest, but the Grail may still exist ...
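A hypothetical sketch of why mismatched class shares make average scores non-comparable across areas. It assumes (an assumption, not stated on the slides) that the four percentages quoted for each area map, in order, to the E/B/A/L score classes from the earlier slide:

```python
# E/B/A/L weights from the "colors and scores" slide (E = 1, B = 0.8, A = 0.5, L = 0).
weights = {"E": 1.0, "B": 0.8, "A": 0.5, "L": 0.0}

# Class shares per area - ASSUMED mapping of the slide's percentages to classes.
shares = {
    "Medical Sciences":            {"E": 0.40, "B": 0.25, "A": 0.14, "L": 0.21},
    "Industr. & Information Eng.": {"E": 0.22, "B": 0.21, "A": 0.13, "L": 0.44},
}

for area, s in shares.items():
    avg = sum(weights[k] * s[k] for k in weights)  # expected score per paper
    print(f"{area}: average score = {avg:.2f}")
# Medical Sciences: average score = 0.67
# Industr. & Information Eng.: average score = 0.45
```

Under this assumed mapping, an average Medical Sciences paper scores about 50% higher than an average Engineering paper purely because the class targets differ, which is exactly why aggregating such scores for funding is nonsensical.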
  88. 88. and the Holy Grail?
  89. 89. And the Holy Grail? Let's ask HEFCE. Report on the pilot exercise to develop bibliometric indicators for the REF [the research assessment]: “Bibliometrics are not sufficiently robust at this stage to be used formulaically or to replace expert review in the REF” http://www.hefce.ac.uk/pubs/year/2009/200939/
  90. 90. And the Holy Grail? Let's ask EMS. Code of Practice - European Mathematical Society http://www.euro-math-soc.eu/system/files/COP-approved.pdf 1. ... the Committee sees grave danger in the routine use of bibliometric and other related measures to assess the alleged quality of mathematical research and the performance of individuals or small groups of people. 2. It is irresponsible for institutions or committees assessing individuals for possible promotion or the award of a grant or distinction to base their decisions on automatic responses to bibliometric data.
  91. 91. On the use of bibliometric indices during assessment – European Physical Society http://www.eps.org/news/94765/ ... the European Physical Society considers it essential that the use of bibliometric indices is always complemented by a broader assessment of scientific content taking into account the research environment, to be carried out by peers in the framework of a clear code of conduct. And the Holy Grail? Let’s ask EPS
  92. 92. And the Holy Grail? Let's ask the Académie des Sciences. “Du Bon Usage de la Bibliométrie pour l'Évaluation Individuelle des Chercheurs” - Institut de France, Académie des Sciences http://www.academie-sciences.fr/activite/rapport/avis170111gb.pd Any bibliometric evaluation should be tightly associated with a close examination of a researcher's work, in particular to evaluate its originality, an element that cannot be assessed through a bibliometric study.
  93. 93. And the Holy Grail? Let’s ask IEEE IEEE Board of Directors: Position Statement on “Appropriate Use of Bibliometric Indicators for the Assessment of Journals, Research Proposals, and Individuals”. http://www.ieee.org/publications_standards/publications/rights/iee Any journal-based metric is not designed to capture qualities of individual papers and must therefore not be used alone as a proxy for single-article quality or to evaluate individual scientists.
  94. 94. http://am.ascb.org/dora/ 10,963 individual signers; 484 organizations And the Holy Grail? Let’s ask DORA
  95. 95. 1. Avoid using journal metrics to judge individual papers or individuals for hiring, promotion and funding decisions. 2. Judge the content of individual papers and take into account other research outputs, such as data sets, software and patents, as well as a researcher’s influence on policy and practice.
  96. 96. Signed by 484 organizations including: - American Association for the Advancement of Science (AAAS) - American Society for Cell Biology - British Society for Cell Biology - European Association of Science Editors - European Mathematical Society - European Optical Society - European Society for Soil Conservation - Federation of European Biochemical Societies - Fondazione Telethon - Higher Education Funding Council for England (HEFCE) - Proceedings of The National Academy Of Sciences (PNAS) - Public Library of Science (PLOS) - The American Physiological Society - The Journal of Cell Biology - Institute Pasteur - CNRS – University Paris Diderot - INGM, National Institute of Molecular Genetics; Milano, Italy - Université de Paris VIII, France - University of Florida - The European Association for Cancer Research (EACR) - Ben-Gurion University of the Negev - Université de Louvain
  97. 97. And the Holy Grail? Let’s ask the literature Interpretation and Impact ”... analysts should also be aware of the potential effect of the results in terms of future behavioural changes by institutions and individuals seeking to improve their subsequent 'ranking'."
  98. 98. Outline • The most accurate ranking ever produced • Should you believe in university rankings? • Governing by numbers • The end is not near. It is here • The Good, the Bad, and the Ugly • The quest for the Holy Grail • You're gonna hear me roar!
  99. 99. OU’LL GONNA HEAR ME ROA
  100. 100. Italy • Media provide distorted information about the expenditure, performance and efficiency of higher education. • This justified heavy budget cuts (-18.7% from 2009 to 2013 in real terms) • “Bureaucratic delirium” • Flawed bibliometric assessments incentivizing gaming. What to do?
  101. 101. A blog devoted to research evaluation and higher-education policy • Birth: October 2011 • Members of the Editorial Board: 14 • Collaborators: > 200 • Contacts from November 2011 to June 2014: 8 million • More than 13,000 daily contacts in 2014 • Articles published: 1,627 • 25,000 comments by readers • Funding: donations from the readers • Often cited by national newspapers and magazines • Good visibility among cultural blogs (see e.g. 8th position in http://labs.ebuzzing.it/top-blogs/cultura)
  102. 102. Thank you!
