"The future of research evaluation: Trends and prospects in research assessment" at AMJE, on November 2, 2012


"The future of research evaluation: Trends and prospects in research assessment" at AMJE, on November 2, 2012

  1. The future of research evaluation: Trends and prospects in research assessment
     Syun Tutiya
     National Institution for Academic Degrees and University Evaluation
     土屋俊
     大学評価・学位授与機構
     November 2, 2012
     JAMJE
  2. Outline
     • Types of "research assessment"
     • Currently in Japan
     • Problems recognized
     • Institutional assessment of research
     • Alternative philosophies and methods
     • Prepublication review and research assessment
  3. Types of "research assessment"
     • By purpose and assessment agent
       – POLICY MAKING => policy maker
       – EFFECTIVENESS => funder
       – EMPLOYMENT/PROMOTION/INCENTIVE => management (university, research institute)
     • By assessment unit
       – institution, department, research unit
       – project, researcher
       – "product": article, protocol, database, dataset, software/program
     • By transparency (objectivity)
       – self
       – external (peer/stakeholder)
       – third party (peer/stakeholder)
     • By viewpoint
       – process -- "how many articles?" (output)
       – quality -- "how many citations?" (outcomes)
       – impact -- "how much money? what product?"
     • By timing
       – prior
       – interim
       – posterior
  4. Current research assessment practices in Japan
     • Project reviews
       – big Grant-in-Aid projects (JSPS and others)
       – JST grant projects
     • Institutional reviews
       – National University Corporation Evaluation by NIAD-UE => by third-party peers
     • Personal assessment
       – employment/promotion
       – funding
     • National trends
       – JST
       – NISTEP
       – Elsevier
  5. Problems of current practices
     • Peer review is OK, but NO standards, NO benchmarks (research evaluation rests on peer review, but why no comparison at all?)
       – typically, the National University Corporation Evaluation
     • "Quantitative indicators"
       – conventional: citations, "journal impact factor"
       – h-index (Hirsch) => researcher-level (productivity and impact) => group-level (but skewed by co-authorship, field differences, discrimination against young researchers, etc.)
       – altmetrics => article-level
     • Commercialization
       – Thomson Reuters, Elsevier (SciVal Spotlight)
     • "World University Rankings"
       – ARWU by Shanghai Jiao Tong University
       – THE-WUR by Times Higher Education (Thomson Reuters)
       – QS-WUR by Quacquarelli Symonds (Scopus), and more
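The h-index mentioned on this slide has a simple definition: the largest h such that the researcher has h papers with at least h citations each. A minimal sketch (the citation counts below are made-up):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations (Hirsch)."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # → 4
```

The same weaknesses the slide lists apply to the number this returns: it ignores co-authorship, varies with field citation norms, and penalizes early-career researchers who have not yet accumulated papers.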
  6. What if RAE/REF in Japan?
     • REF: Research Excellence Framework (UK)
       – for selective allocation of research funding, accountability, benchmarking information, and a reputational yardstick
       – assesses the quality of research in UK HEIs
       – by expert review, 36 units (4 main panels)
       – 5 levels (4* down to unclassified)
       – outputs (65%) in terms of "originality, significance and rigour"; impact (20%) in terms of "reach" and "significance"
     • Any use?
       – close to the "research" category of the National University Corporation assessment
       – bibliometric approach? => "citation information is not sufficiently robust to be used formulaically or as a primary indicator of quality, but there is considerable scope for it to inform and enhance the process of expert review."
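The percentages on this slide imply a weighted overall score. A hedged sketch of that arithmetic, using the slide's weights for outputs (65%) and impact (20%) plus the remaining 15% (research environment in REF 2014); the sub-profile scores on the 0 (unclassified) to 4 (4*) scale are hypothetical:

```python
# REF-style weighted overall profile (scores are made-up illustrations)
weights = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}
scores  = {"outputs": 3.0, "impact": 2.5, "environment": 3.0}

overall = sum(weights[k] * scores[k] for k in weights)
print(round(overall, 2))  # → 2.9
```

In the actual exercise each element is a distribution over the five levels rather than a single number, but the funding-relevant aggregation is a weighted combination of this kind.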
  7. Altmetrics
     • Usually measuring article-level impact
       – views, downloads
       – mentions in social media
       – correlation with citation counts
     • I.e. post-publication, pre-citation impact
       – lightweight review by PLoS ONE => making it sustainable, though
       – Scientific Reports review process growing heavier
     • Any use as an article-level measure?
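The "correlation with citation counts" bullet is typically checked with a rank correlation, since both mentions and citations are heavily skewed. A self-contained Spearman sketch on made-up per-article data (the figures are hypothetical; real studies pull them from altmetrics providers and citation databases):

```python
def rank(values):
    # ranks 1..n by value; assumes no ties in this toy sketch
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# hypothetical articles: social-media mentions vs. later citations
mentions  = [120, 5, 40, 0, 300]
citations = [30, 2, 10, 1, 55]
print(round(spearman(mentions, citations), 2))  # → 1.0 (toy data is perfectly monotone)
```

A high rank correlation would support using early mentions as a pre-citation proxy; real datasets show far weaker, field-dependent correlations.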
  8. eLife's new approach
     "Media policy", Oct 29
     1. Presenting and discussing the work prior to publication
        a. Prior to publication, authors are encouraged to present their findings to their peers.
        b. When there is media interest..., we encourage the author to deposit the accepted version of the manuscript in an open-access repository,
        c. ...include a reference to eLife and/or elifesciences.org.
        "NO Ingelfinger rule! NO embargo!"
     2. Promotion of published content
        a. Every published eLife paper will have a short, plain-language summary (the eLife Digest).
        b. Papers in eLife will be promoted to the media and to interested readers on the day of publication.
        c. We will issue press releases for some papers on the day of publication.
