Research policies in the age of rankings


Presentation by Giuseppe De Nicolao at the 2nd Roars Conference: “Higher Education and Research Policies in Europe: Challenges for Italy”, 21 February 2014
CNR, Piazzale A. Moro 7, Roma


  1. 1. Research policies in the age of rankings — Giuseppe De Nicolao, Università di Pavia
  2. 2. OUTLINE — Prologue: “Mamma li Turchi!” (“the Turks are coming!”) 1. From Egypt with fury 2. Should you believe in the Shanghai ranking? 3. Numb3rs! 4. No rankings? No party. 5. The power of numbers 6. Where are we going?
  3. 3. Prologue — “Mamma li Turchi!” (“the Turks are coming!”)
  4. 4. «We are in the last places of the world rankings. For this reason we will present the University reform in November, [...] I hope — he concludes — never again to see the top Italian university in 174th place»
  5. 5. Who is a “highly skilled migrant”? It depends on the rankings
  6. 6. Highly skilled migrants
     Can I become a highly skilled migrant in the Netherlands — even if I haven't got a job yet?
     To be eligible, you must be in possession of one of the following diplomas or certificates:
     • a master's degree or doctorate from a recognised Dutch institution of higher education, or
     • a master's degree or doctorate from a non-Dutch institution of higher education which is ranked in the top 150 establishments in either the Times Higher Education 2007 list or the Academic Ranking of World Universities 2007 issued by Jiao Tong Shanghai University in 2007
  7. 7. Can I get a scholarship for a master's or a PhD? It depends on the rankings
  8. 8. A foreign professor on the doctoral board? It depends on the rankings
  9. 9. Chapter 1 — From Egypt with fury
  10. 10. rewind ... 16 September 2010
  11. 11. New York Times, November 14, 2010 — Alexandria's surprising prominence was actually due to “the high output from one scholar in one journal” — soon identified on various blogs as Mohamed El Naschie, an Egyptian academic who published over 320 of his own articles in a scientific journal of which he was also the editor.
  12. 12. rewind ... 26 November 2008
  13. 13. • ... of the 400 papers by El Naschie indexed in Web of Science, 307 were published in Chaos, Solitons and Fractals alone, while he was editor-in-chief.
     • El Naschie's papers in CSF received 4992 citations, about 2000 of which are to papers published in CSF, largely his own.
  14. 14. never  again?  
  15. 15. THE  ranking  2012:     oops,  I  did  it  again!    
  16. 16. THE ranking 2012: oops, I did it again!
     • Only a few of the more than 100 co-authors of the 2008 and 2010 reviews were from MEPhI
     • The 2008 particle physics review received nearly 300 times as many citations in the year after publication as the mean for that journal
     • Citations were averaged over the relatively small number of MEPhI's publications, yielding a very high citation rate
     • Further, if citations are generally low in their countries, then institutions get some more value added (regional modification)
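The averaging artifact described above is easy to reproduce. A minimal sketch, with invented numbers (not MEPhI's actual data): one heavily cited multi-author review, divided over a small publication count, yields a citations-per-paper figure that dwarfs that of a much larger producer.

```python
# Invented numbers, for illustration only: how a single heavily cited
# review inflates a small institution's citations-per-paper average.

def citations_per_paper(citation_counts):
    """Average citations over an institution's publication list."""
    return sum(citation_counts) / len(citation_counts)

# Small institution: 50 ordinary papers (~2 cites each) plus one
# multi-author review credited with 3000 citations in full.
small_institution = [2] * 50 + [3000]

# Large institution: 5000 papers averaging 10 cites each.
large_institution = [10] * 5000

print(citations_per_paper(small_institution))  # ≈ 60.8
print(citations_per_paper(large_institution))  # 10.0
```

The small institution's average is six times the large one's, driven entirely by a single outlier paper — exactly the effect a small denominator produces.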
  17. 17. what  about  other  rankings?  
  18. 18. the  top  10  most     spectacular  errors  of  …   reviewed  by  University  Ranking  Watch    
  19. 19. QS greatest hits: international students and faculty in Malaysian universities
     • In 2004 Universiti Malaya (UM) in Malaysia reached 89th place in the THES-QS world rankings.
     • In 2005 came disaster: UM crashed 100 places
     • Political opposition: shame on the university leadership!
     • Real explanation: lots of Malaysian citizens of Indian and Chinese descent were erroneously counted as “foreigners”.
  20. 20. QS greatest hits: 500 wrong student faculty ratios in the 2007 QS Guide
     • Someone slipped three rows when copying and pasting student faculty ratios: Dublin Institute of Technology was given Duke's ratio, Pretoria got Pune's, RWTH Aachen got Aberystwyth's (Wales). And so on. Altogether over 500 errors.
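The mechanism is mundane: a column pasted a few rows out of alignment silently reassigns every value below the slip point. A sketch with made-up institutions and ratios (not QS's actual spreadsheet):

```python
# Made-up data illustrating the error class: a ratio column pasted
# three rows out of alignment, so each university receives the ratio
# of the institution three rows further down the list.
universities = [f"University {c}" for c in "ABCDEFGH"]
true_ratios = [5, 8, 11, 14, 17, 20, 23, 26]

SLIP = 3  # the reported misalignment
slipped = dict(zip(universities, true_ratios[SLIP:]))

print(slipped["University A"])  # 14 — University D's ratio, not its own 5
print(len(slipped))             # 5 — the last three rows lost their data entirely
```

Every assignment is wrong, yet each value is individually plausible — which is why over 500 such errors went unnoticed before publication.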
  21. 21. Let’s  go  technical  ...  
  22. 22. Chapter 2 — Should you believe in the Shanghai ranking?
  23. 23. Shanghai: criteria and weights (%)
  24. 24. The “normalization trap” 1/2
  25. 25. The “normalization trap” 2/2
  26. 26. Should you believe in the Shanghai ranking? An MCDM view — J.-C. Billaut, D. Bouyssou, P. Vincke
     • all criteria used are only loosely connected with what they are intended to capture
     • several arbitrary parameters and many micro-decisions that are not documented
     • flawed and nonsensical aggregation method
     • «the Shanghai ranking is a poorly conceived quick and dirty exercise» «any of our MCDM students that would have proposed such a methodology in her Master's Thesis would have surely failed according to our own standards»
  29. 29. Twenty Ways to Rise in the Rankings (1/3), by Richard Holmes, http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
     1. Get rid of students. The university will therefore do better in the faculty student ratio indicators.
     2. Kick out the old and bring in the young. Get rid of ageing professors, especially if unproductive and expensive, and hire lots of temporary teachers and researchers.
     5. Get a medical school. Medical research produces a disproportionate number of papers and citations, which is good for the QS citations per faculty indicator and the ARWU publications indicator. Remember this strategy may not help with THE, who use field normalisation.
  30. 30. Twenty Ways to Rise in the Rankings (2/3), by Richard Holmes, http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
     7. Amalgamate. What about a new mega university formed by merging LSE, University College London and Imperial College? Or a très grande école from all those little grandes écoles around Paris?
     9. The wisdom of crowds. Focus on research projects in those fields that have huge multi-“author” publications — particle physics, astronomy and medicine, for example. Such publications often have very large numbers of citations.
     10. Do not produce too much. If your researchers are producing five thousand papers a year, then those five hundred citations from a five hundred “author” report on the latest discovery in particle physics will not have much impact.
  31. 31. Twenty Ways to Rise in the Rankings (3/3), by Richard Holmes, http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
     13. The importance of names. Make sure that your researchers know which university they are affiliated to and that they know its correct name. Keep an eye on Scopus and ISI and make sure they know what you are called.
     18. Support your local independence movement. Increasing the number of international students and faculty is good for both the THE and QS rankings. If it is difficult to move students across borders, why not create new borders?
     20. Get Thee to an Island. Leiden Ranking has a little known ranking that measures the distance between collaborators. At the moment the first place goes to the Australian National University.
  32. 32. Chapter 3 — Numb3rs!
  33. 33. Let's open the box [Diagram: RAW DATA → UNIVERSITY RANKING]
  34. 34. INTERNATIONALIZATION — Source: “Malata e Denigrata”, M. Regini (ed.), Donzelli 2009
  35. 35. Let's open the box — 1. “Scientific excellence” 2. Student faculty ratio 3. Job market 4. Funding
  36. 36. 1. “Scientific excellence”
  37. 37. University rankings: highly dubious scientific value. How can one measure a nation's weight in the international scientific landscape? By counting the scientific articles it produces and the citations they receive
  38. 38. Italy: 8th for scientific articles — Source: SCImago, on Scopus data 1996-2012
  39. 39. [Chart: publications per year (WoS), 1985-2010 — United Kingdom, Japan, Germany, France, Canada, Italy, Spain, Netherlands, Sweden, Switzerland]
  40. 40. [Chart: publications 2004-2010, mean annual growth (%)] Source: VQR 2004-2010 – Rapporto Finale ANVUR, June 2013 (Tab. 3.2) (ISI Web of Knowledge data, Thomson-Reuters) http://www.anvur.org/rapporto/files/VQR2004-2010_RapportoFinale_parteterza_ConfrontiInternazionali.pdf
  41. 41. [Chart: publications 2004-2010, number of citations] Source: VQR 2004-2010 – Rapporto Finale ANVUR, June 2013 (Tab. 4.1) (ISI Web of Knowledge data, Thomson-Reuters) http://www.anvur.org/rapporto/files/VQR2004-2010_RapportoFinale_parteterza_ConfrontiInternazionali.pdf
  42. 42. Efficiency: Italy beats Germany, France and Japan
  43. 43. How do these Italians manage to produce so much research with so few resources? OCTOBER 2009
  44. 44. 2. Student faculty ratio
  45. 45. Student/faculty ratio: out of 26 countries, only 5 fare worse than us [Chart: the bottom of the 26-country ranking includes Indonesia, Czech Rep., Slovenia, Belgium, Saudi Arabia and Italy]
  46. 46. 3.  Job  market  
  47. 47. 4.  Funding  
  48. 48. How elitist is the “top 500”? [Chart: number of universities vs performance — the TOP 500 against the OTHER 16,500 UNIVERSITIES] ... and what does it cost to stay at the top?
  50. 50. E. Hazelkorn: “Estimated yearly budget of €1.5 billion to be ranked in the world's top 100”
  51. 51. Spending on universities (% of GDP): Italy is 30th out of 33 (source: OECD 2013)
  52. 52. and despite this ...
  53. 53. % of universities entering the “top 500” (Leiden: top 250) [Chart, by ranking] Source: “Malata e denigrata: l'università italiana a confronto con l'Europa” (M. Regini, ed., Roma, Donzelli 2009)
  54. 54. Chapter 4 — No ranking? No party.
  55. 55. No ranking? No evaluation. “Every evaluation must result in a ranking. This is the logic of evaluation. If there is no ranking, there is no real evaluation either” — Giulio Tremonti, “Il passato e il buon senso”, Corriere della Sera, 22-08-08
  56. 56. but how do the English really do it? To answer, let's go to the sources (the “English VQR”)
  57. 57. NO RANKINGS PLEASE! WE’RE ENGLISH!! “RAE2008 results are in the form of a quality profile for each submission made by an HEI [Higher Education Institution]. We have not produced any ranked lists of single scores for institutions or Units of Assessment, and nor do we intend to.”
  58. 58. 5 (absolute) quality levels
  59. 59. The keystone: the “quality profiles”
  60. 60. From levels to numbers — 9 (since 2011)
  61. 61. The formula: Score = Volume × Cost × (9p4 + 3p3 + p2)
     p4 = % of outputs in class 4
     p3 = % of outputs in class 3
     p2 = % of outputs in class 2
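The formula above can be checked in a few lines. The 9/3/1 weights and the class shares come from the slide (the weight 9 for the top class applies since 2011); the volume, cost weight and quality profile below are invented for illustration.

```python
def rae_score(volume, cost_weight, p4, p3, p2):
    """RAE/REF-style funding score: Volume × Cost × (9·p4 + 3·p3 + p2).

    p4, p3, p2 are the shares of outputs rated in classes 4, 3 and 2;
    lower-rated outputs attract no funding at all.
    """
    return volume * cost_weight * (9 * p4 + 3 * p3 + p2)

# Hypothetical submission: 40 staff (volume), lab-based cost weight 1.6,
# quality profile 20% class 4, 50% class 3, 25% class 2, 5% below.
score = rae_score(40, 1.6, 0.20, 0.50, 0.25)
print(round(score, 1))  # 227.2

# Note the steep quality gradient: moving 10% of outputs from class 3
# to class 4 adds 40 × 1.6 × (9 − 3) × 0.10 = 38.4 to the score.
```

No ranked list is needed anywhere in this computation: funding follows directly from each institution's own quality profile, which is the point of the RAE slides above.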
  62. 62. Chapter 5 — The power of numbers
  63. 63. Rankings
     • Fragile scientific grounds
     • Incentive to gaming
     • Raw data are obscured
     • They are not necessary to manage funding (see RAE/REF)
     Why, then?
  64. 64. Rankings are based on composite indicators — Science or pseudo-science?
  65. 65. Aggregators vs non-aggregators (1/3)
  66. 66. Aggregators vs non-aggregators (2/3)
  67. 67. Aggregators vs non-aggregators (3/3)
     • Aggregators — value in combining indicators: extremely useful in garnering media interest and hence the attention of policy makers
     • Non-aggregators — key objection to aggregation: the arbitrary nature of the weighting process by which the variables are combined
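The non-aggregators' objection can be made concrete with invented data: the same raw indicator scores produce opposite winners under two equally defensible weightings, so the "result" lives in the weights, not in the data.

```python
# Invented indicator scores (0-100) for two hypothetical universities.
scores = {
    "Univ A": {"research": 90, "teaching": 50, "international": 40},
    "Univ B": {"research": 55, "teaching": 80, "international": 85},
}

def composite(weights):
    """Weighted sum of indicators — the aggregation step rankings rely on."""
    return {
        name: sum(weights[k] * v for k, v in indicators.items())
        for name, indicators in scores.items()
    }

# A research-heavy weighting (Shanghai-style emphasis)...
research_heavy = {"research": 0.7, "teaching": 0.2, "international": 0.1}
# ...versus a balanced weighting — just as arbitrary a choice.
balanced = {"research": 0.3, "teaching": 0.4, "international": 0.3}

a = composite(research_heavy)
b = composite(balanced)
print(a["Univ A"] > a["Univ B"])  # True:  Univ A "wins"
print(b["Univ A"] > b["Univ B"])  # False: Univ B "wins"
```

Neither weighting is defensible over the other on scientific grounds, which is exactly the arbitrariness the non-aggregators point to.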
  68. 68. Germany
     • “We look back decades and people came to German universities; today they go to US universities.”
     • The Exzellenzinitiative (2005): from a traditional emphasis on egalitarianism towards competition and hierarchical stratification
  69. 69. France
     • The Shanghai ranking “generated considerable embarrassment among the French intelligentsia, academia and government: the first French higher education institution in the ranking came only in 65th position, mostly behind American universities and a few British ones”
  70. 70. Australia
     • The SJT and QS: at least two Australian universities among the top 100.
     • Opposing strategic options:
       – fund a small number of top-tier competitive universities
       – “creation of a diverse set of high performing, globally-focused institutions, each with its own clear, distinctive mission”
  71. 71. Japan
     • “The government wants a first class university for international prestige”
     • “in order for Japanese HEIs to compete globally, the government will close down some regional and private universities and direct money to the major universities”
     • some institutions will become teaching only.
  72. 72. Why obsessing about the “top 1%”? [Chart: number of universities vs performance — the TOP 1% against the OTHER 16,500 UNIVERSITIES]
  73. 73. Answer:  trickle-­‐down  knowledge!  
  74. 74. E. Hazelkorn on rankings
     • 90 or 95% of our students do not attend elite institutions. Why are we spending so much on what people aren't attending as opposed to what they are attending?
     • Estimated yearly budget of €1.5 billion to be ranked in the world's top 100. May detract resources from pensions, health, housing, ...
     • Are “elite” institutions really driving national or regional economic and social development?
  75. 75. Does  trickle-­‐down  work?     “Governments and universities must stop obsessing about global rankings and the top 1% of the world's 15,000 institutions. Instead of simply rewarding the achievements of elites and flagship institutions, policy needs to focus on the quality of the system-as-a-whole.” There is little evidence that trickle-down works.
  76. 76. Chapter 6 — Where are we going?
  77. 77. Where are we?
     • (Even) Phil Baty (Times Higher Education) admits that there are aspects of academic life where rankings are of little value
     • Can we/you afford the ‘reputation race’?
     • We will have to live in a world in which extremely poor rankings are regularly published and used.
     What can be done then?
  78. 78. What can be done then?
     • There is no such thing as a “best university” in abstracto.
     • Stop talking about these “all purpose rankings”. They are meaningless.
     • Lobby in our own institution so that these rankings are never mentioned in institutional communication
     • Produce many alternative rankings that produce vastly different results.
  79. 79. Thank you for your attention!