"The future of research evaluation: Trends and prospects in research assessment" at AMJE, on November 2, 2012
The future of research evalua/on: Trends and prospects in research assessment Syun Tu/ya Na/onal Ins/tu/on for Academic Degrees and University Evalua/on 土屋俊 大学評価・学位授与機構 Novermber 2, 2012 JAMJE
Outline
• Types of "research assessment" (研究評価の種類)
• Current practices in Japan (日本における状況)
• Recognized problems (その問題点)
• Institutional assessment of research (研究の機関別評価)
• Alternative philosophies and methods (代替的評価)
• Prepublication review and research assessment (査読と研究評価)
Types of "research assessment"
• By purpose and assessment agent (目的と評価主体)
  – POLICY MAKING => policy maker
  – EFFECTIVENESS => funder
  – EMPLOYMENT/PROMOTION/INCENTIVE => management – university, research institute
• By assessment unit (評価対象の単位)
  – institution, department, research unit
  – project, researcher
  – "product" – article, protocol, database, dataset, software/program
• By transparency (客観性)
  – self
  – external (peer/stakeholder)
  – third party (peer/stakeholder)
• By viewpoint (観点)
  – process – "how many articles?" (output)
  – quality – "how many citations?" (outcomes)
  – impact – "how much money? what product?"
• By timing (時期)
  – prior
  – interim
  – posterior
Current research assessment practices in Japan
• Project reviews
  – Big Grant-in-Aid projects (JSPS and others)
  – JST grant projects
• Institutional reviews
  – National University Corporation Evaluation by NIAD-UE => by third-party peers
• Personal assessment
  – employment/promotion
  – funding
• National trends
  – JST
  – NISTEP
  – Elsevier
Problems of current practices
• Peer review is fine, but there are NO standards and NO benchmarks (研究評価はピアレビューで、しかし比較を全然しないのは?)
  – The National University Corporation Evaluation is typical (国立大学法人評価は、その典型)
• "Quantitative indicators"
  – Conventional: citations, "journal impact factor"
  – h-index (Hirsch) => researcher-level (productivity and impact) => group-level (but sensitive to co-authorship and fields, biased against young researchers, etc.)
  – Altmetrics => article-level
• Commercialization
  – Thomson Reuters, Elsevier (SciVal Spotlight)
• "World University Rankings"
  – ARWU by Shanghai Jiao Tong University
  – THE-WUR by Times Higher Education (Thomson Reuters)
  – QS-WUR by Quacquarelli Symonds (Scopus), and more
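The h-index mentioned above has a simple definition: a researcher's h-index is the largest h such that h of their papers each have at least h citations. A minimal sketch of that computation (the function name and sample citation counts are illustrative, not from the slides):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    # Walk the counts from most- to least-cited; at rank i (1-based),
    # the index can rise to i only if that paper has >= i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical citation counts for five papers:
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Note how the same h can hide very different profiles (one hugely cited paper vs. many modestly cited ones), which is one root of the field- and career-stage biases listed above.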
What if RAE/REF in Japan?
• REF: Research Excellence Framework (UK)
  – For selective allocation of research funding, accountability, benchmarking information and a reputational yardstick
  – Assesses the quality of research in UK HEIs
  – By expert review, 36 units of assessment (4 main panels)
  – 5 levels (4* to unclassified)
  – outputs (65%) in terms of "originality, significance and rigour"; impact (20%) in terms of "reach" and "significance"
• Any use?
  – Close to the "research" category of the National University Corporation Evaluation
  – A bibliometric approach? => "citation information is not sufficiently robust to be used formulaically or as a primary indicator of quality, but there is considerable scope for it to inform and enhance the process of expert review."
Altmetrics
• Usually measure article-level impact
  – views, downloads
  – mentions in social media
  – correlation with citation counts
• I.e. post-publication, pre-citation impact
  – Lightweight review by PLoS ONE => making it sustainable, though
  – Scientific Reports' review process is growing heavier
• Any use as an article-level measure?
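The slide notes that altmetrics are often validated against citation counts. One common choice for such skewed count data is Spearman rank correlation; as a hedged illustration, a self-contained sketch (function names and the sample download/citation figures are hypothetical):

```python
def ranks(values):
    """Rank each value (1-based), assigning tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend the run of tied values
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the ranks = Spearman rank correlation."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-article downloads vs. later citation counts:
downloads = [120, 340, 95, 600, 210]
citations = [3, 9, 1, 20, 5]
print(round(spearman(downloads, citations), 3))
```

A rank correlation is preferred over Pearson here because a single viral article can dominate raw counts; ranks dampen that skew.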
eLife's new approach: "Media policy" (Oct 29)
1. Presenting and discussing the work prior to publication
  a. Prior to publication, authors are encouraged to present their findings to their peers
  b. When there is media interest..., we encourage the author to deposit the accepted version of the manuscript in an open-access repository,
  c. ...include a reference to eLife and/or elifesciences.org.
  "NO Ingelfinger rule! NO embargo!"
2. Promotion of published content
  a. Every published eLife paper will have a short, plain-language summary (the eLife Digest).
  b. Papers in eLife will be promoted to the media and to interested readers on the day of publication.
  c. We will issue press releases for some papers on the day of publication.