Paper 5: Rankings of International Research Institutes (Liu, Wenbin)

1. Evaluation and Ranking of National Research Institutes
Project Group of National Research Institute Ranking, Institute of Policy and Management, Chinese Academy of Sciences
Presented by Prof. W.B. Liu, Kent Business School, University of Kent, UK
2010-07-01, Canterbury, UK
2. Motivation
- CAS, the largest independent research institution in China, has almost 100 research institutes covering all disciplines of the natural sciences and about 30,000 permanent research staff.
- Its focus has moved from quantitative expansion to qualitative improvement.
- It is crucial to enhance competitiveness and sustainability at the international level.
- CAS needs to know its relative position and the gap between itself and comparable institutions in the world:
  - strengths and weaknesses
  - how to improve
- Hence evaluation and ranking.
3. Evaluation and Ranking of National Research Institutes
- CAS needs to compare its performance with that of leading national research institutes.
- There are international evaluation and ranking reports that cover R&D at the level of countries or universities.
- But no reports are tailor-made for national research institutes.
- Nor can these reports be used to diagnose the weaknesses of CAS.
4. What is a national research institute?
They are:
(1) Large multi-discipline research institutes administered and funded by a country
(2) Research institutes administered and funded by ministries or public bodies
(3) Research institutes administered and funded by local government, or with mixed funding
5. Evaluation and Ranking of National Research Institutes
- The research of a national research institute often covers many different subject areas and serves multiple purposes: basic research, applied research and R&D.
- The key issue is how to evaluate its overall capability and performance.
- The study started in 2006 in the Evaluation Centre of IPM, CAS; the CAS Library joined later.
- A project group was formed; key members included Li X.X., Wei Meng, Ms Liu Z.Y., Ms Fang Xu (Kent) and Mr Y.G. Yang. Prof. W.B. Liu was in charge of methodology development.
- The project was finished in 2009.
6. What are their missions?
- R&D for the development needs of the country
  - This part is difficult to compare and evaluate.
- Basic and applied scientific research
  - For this part it is possible to compare performance, at least by subject. We developed the 3E measures for this purpose:
  - E1: Research Efficacy
  - E2: Research Efficiency
  - E3: Research Effectiveness
7. Evaluation and Ranking of National Research Institutes
- Based on Soft Systems Methodology (SSM), we developed a methodology for building an indicator system that measures the efficacy, efficiency and effectiveness of a national research institute.
- We built the theoretical 3E indicator system for the overall capability and performance of national research institutes, based on the SSM models of basic research and applied research.
- We developed evaluation methods that aggregate the data in order to produce rankings for the institutes.
8. Hierarchical Structure of Indicators
- Output and effectiveness indicators were divided into several levels.
- Example: high-quality publications are split into Top 0.1%, Top 1%, Top 10% and SCI, with decreasing weights (see the sketch below).
- SCI editorship is divided into Top 1%, Top 5% and Top 10%.
- Awards were divided similarly: cross-subject, subject and sub-subject.
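As an illustration of the decreasing-weight idea, the sketch below assigns each paper to its highest tier and sums tier weights. The tier weights and the percentile-based tier rule are illustrative assumptions, not the scheme used in the study (the weights actually used appear in the E3 plans later).

```python
# Minimal sketch of tiered publication weighting with decreasing weights.
# TIER_WEIGHTS are illustrative placeholders, not the study's weights.

TIER_WEIGHTS = {
    "top0.1%": 6.0,   # most selective tier, highest weight
    "top1%":   3.0,
    "top10%":  1.5,
    "sci":     1.0,   # any other SCI paper
}

def paper_tier(citation_percentile: float) -> str:
    """Map a paper's citation percentile (lower = better) to its highest tier."""
    if citation_percentile <= 0.1:
        return "top0.1%"
    if citation_percentile <= 1.0:
        return "top1%"
    if citation_percentile <= 10.0:
        return "top10%"
    return "sci"

def weighted_publication_score(percentiles: list[float]) -> float:
    """Sum tier weights over an institute's SCI papers."""
    return sum(TIER_WEIGHTS[paper_tier(p)] for p in percentiles)

print(weighted_publication_score([0.05, 0.8, 5.0, 40.0]))  # 6 + 3 + 1.5 + 1 = 11.5
```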
9. Evaluation and Ranking of National Research Institutes: Outline
- Approach
- Samples
- Indicators
- Evaluation method
- Case study
- Further work: data reliability and methodological robustness
- International discussion group
10. Approach
- Subject evaluation first, then cross-subject evaluation
- Theoretical base: SSM and the 3E indicator theory
  - Research Output (E1)
  - Research Efficiency (E2)
  - Research Effectiveness (E3)
- Overall ranking by aggregating the sub-rankings of subjects
11. Approach (diagram)
SSM analysis; then, for each subject (Subject 1 ... Subject N): E1 output, E2 efficiency, E3 effectiveness; subject rankings first, and then the overall ranking.
12. Ranking of RIs (process)
Overall performance; function analysis (single discipline / multi-discipline; basic research / applied research); indicators; samples.
- Sample comparison, feedback and discussion
- Define and select indicators, build the indicator system
- Aggregation, preliminary calculation, further adjustment
13. Sample Selection: basic research
- Generating new knowledge
- Undertaking state projects or programmes, contributing to high-tech development and, ultimately, to social and economic development
Selection: based on ESI (Essential Science Indicators), 20 disciplines (also EI, ...). From the samples pool, 89 organisations were selected world-wide using three discipline-level criteria: total publications in the top 100, total citations in the top 100, and citations per article in the top 100 (see the sketch below).
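A minimal sketch of this selection rule, assuming a simple record format with fields "org", "discipline", "pubs" and "cites"; the field names and data handling are hypothetical placeholders for the ESI data actually used.

```python
# Keep an (organisation, discipline) pair if it ranks in the top 100 of that
# discipline on total publications, total citations, or citations per article.

def cites_per_paper(r):
    return r["cites"] / max(r["pubs"], 1)

def select_samples(records, top_n=100):
    """records: list of dicts with keys 'org', 'discipline', 'pubs', 'cites'."""
    selected = set()
    by_disc = {}
    for r in records:
        by_disc.setdefault(r["discipline"], []).append(r)
    for disc, rows in by_disc.items():
        for metric in (lambda r: r["pubs"], lambda r: r["cites"], cites_per_paper):
            top = sorted(rows, key=metric, reverse=True)[:top_n]
            selected.update((r["org"], disc) for r in top)
    return selected
```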
14. Sample: scale differences (chart)
15. Sample: subject distribution (share of output by subject)

Subject              USDA    CAS
1. BiSci              50%    11%
2. Chem                4%    27%
3. Comp                0%     2%
4. Eng                 2%     6%
5. Envi               14%     3%
6. Geosciences         2%     9%
7. Materials           1%    10%
8. Mathematics         0%     2%
9. Medicine            5%     2%
10. Neuroscience       0%     1%
11. Physics            0%    24%
12. Social sciences    1%     0%
13. Multidisciplinary  1%     2%
14. Immunology         1%     0%
15. Agriculture       19%     1%
16. Indicators: existing indicator reports

Year                                Organisation                                                                                         Title
2000                                European Commission (EC)                                                                             Knowledge, Science and Innovation (KSI)
2003                                Economic and Social Commission for Western Asia (ESCWA)                                              New Science and Technology Indicators (NST)
2002-2006                           National Science Board (NSB)                                                                         Science and Engineering Indicators, US (SEI)
2003                                World Economic Forum (WEF)                                                                           Global Competitiveness Indicators (GCI)
2003                                OECD                                                                                                 Main Science and Technology Indicators (MSTI)
2004                                IMD, Lausanne                                                                                        World Competitiveness Yearbook (WCY)
2004-2006; 1991-2001; 1000-2004; 2006   Wuhan University; Research Institute of Management Science, Guangdong; Graduates Development Centre, MOE   Evaluation and Ranking of Chinese Universities (CUE)
2005                                Wuhan University                                                                                     World University Research Competitiveness Index (WUSRCI)
2006                                CAS                                                                                                  CAS Innovation Index (CASCCI)
2004                                MOST, China                                                                                          Chinese Science and Technology Indicators (CSTY)
2005                                MOST, China                                                                                          Chinese Science and Technology Yearbook (CSSTY)
2003-2005                           Bureau of Statistics, China                                                                          Chinese Statistical Yearbook (CSY)
17. Indicator comparison
Characteristics:
- Statistics-oriented indicators (CSY, CSTY, CSSTY, OECD MSTI):
  - present the overall national or local picture of R&D activities;
  - their linkage with organisational functions is weak, so they are hard to use to evaluate research institutions directly.
- Indicators built for a specific assessment objective (e.g. university rankings):
  - use as few indicators as possible to achieve the evaluation objective;
  - more or less lack a systematic, inter-functional analysis of research institutions.
- Indicator definition and selection: expert experience and questionnaires.
18. Existing indicator systems for research institutes
- Cover only partial functions
- Use too many indicators
- Have no explicit relationship to internal functions
- Mix different types of indicator
- Scores almost always come from weighted averaging
- ...
19. Indicator system and the 3Es theory
Inputs flow into the internal operational system and produce outputs, within an external environment.
E1 = Efficacy, E2 = Efficiency, E3 = Effectiveness.
20. Why system analysis is needed
- How do we build a complete 3E indicator system?
- A complete indicator system can only be based on a good understanding of the internal processes.
- A good SSM system analysis brings not only a good understanding of the internal processes but also a structured 3E indicator database.
- It combines existing techniques well.
21. SSM and the 3Es
The classical SSM cycle: 1. inspect the complex situation; 2. express the messy system; 3. build root definitions (RD1, RD2, RD3) of relevant purposeful activity systems; 4. formulate conceptual models (CM); 5. compare the CMs with step 2; 6. debate what is systemically desirable and culturally feasible; 7. take action to improve.
The spirit of SSM is that any system can be analysed by answering three questions:
- What to do (P): relates to Efficacy, which measures the outputs of the system.
- How to do it (Q): relates to Efficiency, which evaluates whether minimum resources are used.
- Why do it (R): relates to Effectiveness, which assesses whether the system's outputs are meaningful or useful to the higher-level (wider) system.
22. Evaluation framework: a system monitored and controlled by the 3Es
Based on the 3Es:
- Efficacy: the outputs of a system, or its value added
- Efficiency: outputs/resources, or value added/resources
- Effectiveness: the desirable impact of the efficacy on the aims of higher-level systems
Example: a production line can produce goods efficiently yet make no profit, so its effectiveness for the company is zero; or it may make profits but have zero effectiveness for the state.
The framework can be extended to 4Es, 5Es, ...
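The sketch below only restates the three measures in code: E1 as output (value added), E2 as output divided by resources, and E3 as a separately supplied judgement of impact on the wider system. The numbers in the production-line example are made up.

```python
# Minimal sketch of the 3E measures. E3 cannot be derived from E1 and E2 alone;
# it is a judgement about the wider system and is supplied externally here.

from dataclasses import dataclass

@dataclass
class SystemAssessment:
    outputs: float          # E1: what the system produced (value added)
    resources: float        # inputs consumed
    effectiveness: float    # E3: contribution to the higher-level system's aims

    @property
    def e1(self) -> float:
        return self.outputs

    @property
    def e2(self) -> float:
        return self.outputs / self.resources if self.resources else 0.0

    @property
    def e3(self) -> float:
        return self.effectiveness

# The production-line example: efficient (high E2) yet ineffective for the company (E3 = 0).
line = SystemAssessment(outputs=1000.0, resources=50.0, effectiveness=0.0)
print(line.e1, line.e2, line.e3)  # 1000.0 20.0 0.0
```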
23. Basic research evaluation in CAS
- Clients: CAS; REC (PO)
- Problem owner (PO): the institutes
24. Basic research in CAS: CAS as a whole (root definition)
A state-owned system to improve the originality, significance, reputation and sustainability of CAS basic research in the natural sciences by developing research capabilities and infrastructure, and by improving the resource utilisation of CAS, in order to benefit Chinese social and economic development and to enrich human knowledge.
C: World scientific community
A: CAS headquarters and institutes
T: Improvement in significance, reputation and sustainability
W: The goal of CAS is improvement in research outcomes
O: Chinese State Council
E: Economic and cultural constraints, current research infrastructure
(R): Institutes, scientists, money, equipment
25. Basic research in CAS: generic institute (root definition)
A CAS-owned system to enrich the world's knowledge in a particular scientific domain with original and significant research, by identifying potential areas of discovery, developing the capabilities to undertake appropriate research, carrying out the research and disseminating it through prestigious channels, in order to enhance the reputation, significance and sustainability of CAS and the Institute.
C: World scientific community in the discipline; CAS (reputation)
A: Institute personnel
T: Generating significant new knowledge
W: The role of an institute within CAS is to generate significant new knowledge
O: CAS
E: CAS procedures, funding
(R): Scientists, other staff, funding, equipment
26. Generic Institute Conceptual Model
1. Identify potentially significant areas of discovery
2. Appreciate current resources, capabilities and projects
3. Decide which opportunities to pursue
4. Obtain the necessary resources
5. Conduct the research
6. Disseminate the results through prestigious channels
7. Contribute to the reputation, resources and sustainability of the Institute
Monitoring and control: operational control against criteria for efficacy and efficiency; strategic control against criteria for effectiveness.
27. Example: CM1, identify potentially significant areas of discovery
RD1: A system to identify research opportunities for institute X that are significant and original, bearing in mind the resources and capabilities needed, by effective external scanning and by improving internal discussion and communication, in order to help decide which opportunities to pursue and so enhance the reputation, resources and sustainability of the Institute.
Inputs: areas of possible research in related disciplines; resources (funding, hardware investment, staff); research levels; research reputation; CAS procedures. Output: areas of significance identified.
Activities:
1.1 Scan the external environment for research opportunities
1.2 Consider current resources and capabilities
1.3 Encourage discussion and communication within and among institutes
1.4 Decide which opportunities are potentially feasible
1.5 Specify extra requirements for the potential possibilities
E1: does it produce possible further research topics?
E2: E1/resources (e.g. time, funds, people)
E3: do we get significant research done that contributes to research reputation and resource enhancement?
28. CM4: the system for obtaining necessary resources
4.1 Know which possible research can be conducted
4.2 Appreciate the extra resources that will be needed: human resources, equipment, competence, funds
4.3 Decide how these can be obtained:
  4.3.1 Human resources
  4.3.2 Equipment: by applying to headquarters, based on current research capability and further development
  4.3.3 Funding: by applying for external research projects or internal projects
  4.3.4 Others: by applying for extra investment to improve the research infrastructure, such as buildings and renovation
4.4 Provide institutional support
4.5 Make the appropriate applications and proposals
4.6 Check that the necessary resources are obtained
29. Primary extracted efficacy indicators
30. 3E indicators compared with existing indicator systems
The extracted 3E indicators, grouped by type, were compared with the existing systems (CSY, CSSTY, CSTY, CASCCI, WUSRCI, CUE, FUCY, WCY, MSTI, GCI, SEI, NST, KSI):
- Bibliometrics: E1 SCI/SSCI papers; E1 SCI/SSCI cooperation papers; E3 SCI/SSCI total citations; E2 SCI/SSCI citations per paper; E3 Top 0.1% papers; E3 Top 1% papers; E3 Top 10% papers; E3 top papers based on mega-science infrastructure; E2 SCI/SSCI papers per staff; E2 Top 0.1%-10% papers per staff; E2 percentage of cooperation papers; E2 percentage of top papers; E2 SCI/SSCI papers per unit of funding
- Patents: E1 number of patent applications; E1 number of patents awarded; E3 core patent awards; E3 commercialised patents; E2 patents awarded per staff
- Financial measures: E1 research expenditure; E2 research funding per staff; E1 competitive research funding; E1 rate of capital assets; E2 ratio of internal to external funding; E1 vendibility of owned companies; E2 profit of owned companies
- Awards and talent: E3 awards impact; E3 invited talks; E3 excellent scientists; E2 awards per staff; E2 invited talks per staff; E2 rate of excellent scientists
- Others: E1 mega-science infrastructure; E1 exchange graduates; E1 joint labs; E1 joint projects; E1 graduate enrolment; E3 high-quality graduates
31. E3 Plan 1
E1: Output (1/3)
- 1.1 SCI total (1/2)
- 1.2 High-quality publications: Top 0.1% (1/6), Top 1% (1/6), Top 10% (1/6)
E2: Efficiency (1/3)
- 2.1 SCI per head (1/2)
- 2.2 High-quality publications per head (1/2)
E3: Effectiveness (1/3)
- 3.1 Citations: total (1/6), average (1/6)
- 3.2 Position: posts in international organisations (1/6); editorship: Top 1% (1/18), Top 5% (1/18), Top 10% (1/18)
- 3.3 Awards: cross-subject (1/6), subject and below (1/6)
32. E3 Plan 2
E1: Output (1/3)
- 1.1 SCI total (1/2)
- 1.2 High-quality publications: Top 0.1% (1/6), Top 1% (1/6), Top 10% (1/6)
E2: Efficiency (1/3)
- 2.1 SCI per head (1/2)
- 2.2 High-quality publications per head (1/2)
E3: Effectiveness (1/3)
- 3.1 Citations: total (1/6), average (1/6)
- 3.2 Editorship: Top 1% (1/9), Top 5% (1/9), Top 10% (1/9)
- 3.3 Honours and awards: cross-subject (1/12), subject and sub-subject (1/12); posts in international organisations (1/6)
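The sketch below shows how the Plan 1 weights combine already-normalised indicator values into a single score. The weight tree follows the slide; the indicator names and the normalisation step are assumptions for illustration.

```python
# Plan 1 weighting scheme as given on the slide; indicator values are assumed
# to be normalised beforehand (e.g. institute value / sample average).

PLAN1 = {
    "E1": (1/3, {
        "sci_total":  1/2,
        "top0.1%":    1/6,
        "top1%":      1/6,
        "top10%":     1/6,
    }),
    "E2": (1/3, {
        "sci_per_head":    1/2,
        "hi_pub_per_head": 1/2,
    }),
    "E3": (1/3, {
        "citations_total":   1/6,
        "citations_average": 1/6,
        "intl_positions":    1/6,
        "top1%_editor":      1/18,
        "top5%_editor":      1/18,
        "top10%_editor":     1/18,
        "award_cross":       1/6,
        "award_subject":     1/6,
    }),
}

def plan_score(values: dict, plan: dict = PLAN1) -> float:
    """values maps indicator names to normalised scores; missing indicators count as 0."""
    total = 0.0
    for group_weight, indicators in plan.values():
        total += group_weight * sum(w * values.get(name, 0.0) for name, w in indicators.items())
    return total
```

Plan 2 differs only in the E3 branch, so it can be expressed by swapping that sub-dictionary.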
33. Random Sensitivity Analysis

No.  Institution                                               E3 (max) Plan 1   Plan 2   Change (5%) Plan 1   Plan 2   Change (10%) Plan 1   Plan 2
     Average change                                                                        0.0978               0.0477   0.1363                0.0666
1    US Department of Agriculture (USDA)                       0.877238          0.8772   0                    0        0                     0
2    French National Institute for Agricultural Research (INRA) 0.192529         0.2536   0.0257               0.0097   0.0247                0.0093
3    Spanish National Research Council (CSIC)                  0.150366          0.1796   0.0317               0.0132   0.0324                0.0135
4    Agriculture and Agri-Food Canada                          0.110103          0.1493   0.0302               0.0111   0.0247                0.0091
5    Ministry of Agriculture, Forestry and Fisheries (Japan)   0.050875          0.0518   0.0490               0.0240   0.0829                0.0407
6    CSIRO (Australia)                                         0.112674          0.1512   0.0554               0.0206   0.0590                0.0220
7    Chinese Academy of Sciences (CAS)                         0.116528          0.1735   0.0235               0.0078   0.0482                0.0161
8    Indian Council of Agricultural Research (ICAR)            0.021313          0.0213   0.1763               0.0881   0.4217                0.2108
10   UK Research Councils                                      0.111147          0.1403   0.0426               0.0168   0.0572                0.0226
11   Council of Scientific and Industrial Research (India)     0.05148           0.0550   0.1300               0.0607   0.1448                0.0676
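The exact perturbation procedure is not given on the slide; the sketch below shows one plausible reading, in which each weight is randomly perturbed by up to 5% (or 10%), the weights are renormalised, and the average absolute change in each institute's score is recorded.

```python
# One possible random sensitivity check: multiplicative weight noise + renormalisation.

import random

def score(values: dict, weights: dict) -> float:
    return sum(w * values.get(k, 0.0) for k, w in weights.items())

def perturb(weights: dict, amplitude: float) -> dict:
    """Multiply each weight by a factor in [1-amplitude, 1+amplitude], then renormalise."""
    noisy = {k: w * random.uniform(1 - amplitude, 1 + amplitude) for k, w in weights.items()}
    total = sum(noisy.values())
    return {k: w / total for k, w in noisy.items()}

def average_change(values_by_inst: dict, weights: dict, amplitude: float, trials: int = 1000) -> dict:
    """Mean absolute score change per institute over random weight perturbations."""
    base = {inst: score(vals, weights) for inst, vals in values_by_inst.items()}
    drift = {inst: 0.0 for inst in values_by_inst}
    for _ in range(trials):
        w = perturb(weights, amplitude)
        for inst, vals in values_by_inst.items():
            drift[inst] += abs(score(vals, w) - base[inst])
    return {inst: d / trials for inst, d in drift.items()}
```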
34. Weights and Aggregation
- Each subject has the same importance.
- Weighted averages are used at all levels inside a subject; all indicators at the same level have equal weights.
- Thus institutes with a fuller range of research subjects and good-quality research will perform better.
35. Aggregation techniques
- Subject ranking: for each ESI-21 subject, compute E1 and E3.
- Institute ranking: based on these, compute total scores or total ranks of E1 and E3 to rank the institutes (in terms of E1 and E3).
- E2 data are difficult to obtain.
36. Aggregation techniques
- Method One: compute the total E3 score over all subjects.
- Method Two: calculate the total ranks of E1 or E3.
- Method Three: count the numbers of gold, silver and bronze positions (see the sketch below).
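A small sketch of Method Three, counting first, second and third places per subject and sorting institutes Olympic-style; the tie-breaking order (gold first, then silver, then bronze) is an assumption, since the slide does not specify it.

```python
# Medal-count aggregation: tally 1st/2nd/3rd places per subject and sort by them.

from collections import Counter

def medal_table(subject_ranks: dict) -> list:
    """subject_ranks: {subject: {institute: rank}} with rank 1 = best.
    Returns [(institute, Counter({1: golds, 2: silvers, 3: bronzes})), ...]."""
    medals = {}
    for ranks in subject_ranks.values():
        for inst, rank in ranks.items():
            if rank <= 3:
                medals.setdefault(inst, Counter())[rank] += 1
    return sorted(medals.items(),
                  key=lambda item: (item[1][1], item[1][2], item[1][3]),
                  reverse=True)
```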
37. Aggregation techniques
- Within each discipline, E1-E3 are normalised by the sample average.
- Each discipline is equally important.
- Methods (see the sketch below):
  - rank-added method
  - score-added method
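A minimal sketch of the two methods, assuming raw per-discipline scores as input: values are normalised by the discipline average, then either the normalised scores (score-added) or the per-discipline ranks (rank-added) are summed, as in the case-study tables that follow.

```python
# Rank-added and score-added aggregation over equally weighted disciplines.

def normalise_by_average(values: dict) -> dict:
    """values: {institute: raw score} for one discipline; divide by the sample average."""
    avg = sum(values.values()) / len(values)
    return {inst: v / avg for inst, v in values.items()}

def score_added(by_discipline: dict) -> dict:
    """by_discipline: {discipline: {institute: raw score}} -> total normalised score (higher is better)."""
    totals = {}
    for values in by_discipline.values():
        for inst, v in normalise_by_average(values).items():
            totals[inst] = totals.get(inst, 0.0) + v
    return dict(sorted(totals.items(), key=lambda kv: kv[1], reverse=True))

def rank_added(by_discipline: dict) -> dict:
    """Sum each institute's rank (1 = best) over all disciplines (lower total is better)."""
    totals = {}
    for values in by_discipline.values():
        ordered = sorted(values, key=values.get, reverse=True)
        for rank, inst in enumerate(ordered, start=1):
            totals[inst] = totals.get(inst, 0) + rank
    return dict(sorted(totals.items(), key=lambda kv: kv[1]))
```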
38. Case study: aggregation by the rank-added method (score = sum of the 13 subject ranks; lower is better)

Unit     Rank  Score   Soci  Phys  Multi  Med  Math  Mate  Geo  Envi  Eng  Comp  Che  BioS  Agr
Unit2      1     53      5     4     3     4     2     4    3     5    1     2    2     4   14
Unit1      2     80     10     2     1     7     6     6    1     6   15     4    1     3   18
Unit7      3     99     13     5     7    22     3     2    8     8    4     5    3    11    8
Unit4      4    121      7     1     4    20     5     3    6    19    3     6    4    14   29
Unit3      5    162     27     7    23    17     1     8   10     1    2    20    8    18   20
Unit6      6    173      9    16    19     9    10    11   13    27    7     9   10    16   17
Unit8      7    177     12    11    18    16    25     9   14     9   22    14    9    12    6
Unit10     8    204     19    14    12     8    24    10   12    17    8    18   19    15   28
Unit5      9    207      1    27     2     1     9    24   46    15   34     8   20     1   19
Unit9     10    216     14    15     8    12     7    16   24    31   26     3   16    13   31
Unit15    11    241      8    31    11    14    15    33   32    12   31    19   27     6    2
Unit13    12    247     17    25    29    24    20    25   17    13   16    11   23    20    7
Unit17    13    259     23    12    10    33    23    13    2    21    5    12   28    40   37
Unit20    14    261      6    37    25    11    31    12   29     4   29    31   36     9    1
Unit28    15    263     36     8    13    27    34     5   23    23   10    13    7    21   43
39. Case study: aggregation by the score-added method (score = sum of normalised subject scores; higher is better)

Unit     Rank  Score      Soci   Phys  Multi    Med   Math   Mate   Geo   Envi   Eng   Comp   Che   BioS   Agr
Unit2      1   52.1937    1.67   4.17   4.32   2.69   9.45   4.48  4.20   2.69  5.40   4.12  4.65   3.52  0.81
Unit1      2   45.4129    0.97   6.09   6.25   1.90   2.15   2.92  5.89   2.64  1.50   3.18  7.90   3.56  0.48
Unit5      3   41.5793   10.26   0.08   5.77  13.74   1.38   0.36  0.05   1.42  0.24   1.73  0.68   5.49  0.39
Unit7      4   37.7177    0.72   4.00   2.36   0.42   7.58   5.25  2.03   1.89  3.81   2.66  4.15   1.19  1.68
Unit4      5   35.0232    1.18   6.78   3.08   0.48   2.37   4.54  2.92   1.16  4.56   2.62  4.09   1.01  0.22
Unit3      6   33.4436    0.25   2.74   0.42   0.58  11.05   1.82  1.63   6.31  4.97   0.60  1.77   0.93  0.39
Unit20     7   18.657     1.32   0.04   0.41   1.13   0.09   1.30  0.24   2.80  0.34   0.26  0.12   1.83  8.78
Unit18     8   16.965     0.02   2.40   3.04   2.50   0.15   0.69  0.27   0.15  2.01   0.90  0.77   3.96  0.11
Unit11     9   16.9447    2.78   0.08   2.62   4.97   0.84   0.24  0.04   0.13  0.14   1.86  0.66   2.29  0.32
Unit8     10   15.926     0.74   1.78   0.68   0.72   0.28   1.44  1.15   1.88  0.72   0.99  1.55   1.13  2.87
Unit54    11   15.8998    0.15   0.07   0.01   0.04   4.16   0.00  0.02   0.05  1.55   9.68  0.04   0.12  0.00
Unit6     12   15.5748    1.03   1.38   0.66   1.34   1.29   1.33  1.25   0.47  2.11   1.68  1.54   0.99  0.49
Unit15    13   15.2697    1.06   0.06   1.24   0.84   0.89   0.09  0.20   1.56  0.32   0.73  0.32   2.78  5.18
Unit17    14   14.6666    0.31   1.68   1.24   0.14   0.40   1.26  4.58   1.05  2.40   1.08  0.25   0.14  0.13
MAX (per subject)        10.26   6.78   6.25  13.74  11.05   5.25  5.89   6.31  5.40   9.68  7.90   5.49  8.78
40. Case study (comparison chart of Unit 1, Unit 2, Unit 6, Unit 7 and Unit 10)
41. Limitations
- Some institutes' main outputs are not publications.
- Some institutes concentrate on only a few subjects and therefore have lower totals.
- Although the indicators were developed from SSM, not all of them are available in practice.
- In applied research, many data are not easy to obtain.
42. Thanks
