Paper 5: Study of the Model and Methodology for Institute Evaluation (Yang)


  • Speaker note (07/05/10): there are over one hundred institutions directly under CAS, including……

    1. A Study of the Model and Methodology for Institute Evaluation in CAS
       Guoliang YANG, Xiaoxuan LI
       Institute of Policy and Management, CAS
       Management Innovation and Evaluation Research Center, CAS
    2. Contents
       • Brief introduction of CAS and the evaluation practice of institutes in CAS
       • Three-hierarchy evaluation model
       • Conclusions and discussions
    3. 1 Brief Introduction of CAS and Institute Evaluation Practice in CAS
    4. Brief Introduction of CAS
       • The Chinese Academy of Sciences (CAS) was founded in Beijing on November 1, 1949. Consisting of the Academic Divisions and various subordinate institutions, it is the leading national academic institution in the natural sciences, a major advisory body to the government on science- and technology-related issues, and China's national comprehensive research and development center in the natural sciences and high technology.
       • The objective of CAS is to develop the base for scientific research, for training scientific talent, and for incubating high-tech industries in China; to become a national scientific think tank; and to evolve into a national research institution that boasts "first-class achievements, first-class efficiency, first-class management and first-class talent."
    5. Overview of CAS
       CAS Academic Divisions (709 Members, 53 Foreign Members):
       • Division of Mathematics and Physics
       • Division of Chemistry
       • Division of Life Sciences and Medicine
       • Division of Earth Sciences
       • Division of Information Technical Sciences
       • Division of Technological Sciences
       CAS Headquarters committees:
       • Committee for Consultation and Review
       • Committee on Scientific Ethics
       • Committee for Science Popularization and Publication
       Main Infrastructure:
       • 17 Large-scale Scientific Research Facilities
       • 7 National Labs
       • 5 Field Station Networks
       • 36 National Engineering Centers
       • 273 Technology Transfer Centers
       • 317 Journals
       • 46 National Associations and Societies
       Institutions Directly under CAS (distributed in 22 provinces and cities across China):
       • 94 Research Institutes
       • 2 Universities and Schools
       • 2 Supporting Units
       • 3 Botanical Gardens
       • 12 Branches
       • 2 Press and Publication Companies
       • 1 Assets Management Company
       • 22 Holding Enterprises
    6. Knowledge Innovation Project (KIP)
       • In 1998, with the approval of the Chinese government, CAS launched the Knowledge Innovation Project (KIP).
       • In the course of the KIP, in response to national strategic requirements and world trends in science and technology, CAS made the most profound and extensive adjustments to its disciplinary deployment and organizational structure since its founding in 1949.
    7. Development of Evaluation Concepts and Methods in CAS
       • Qualitative evaluation → focus on quantitative evaluation
       • Ranking of institutes → classification by categories
       • Annual evaluation → five-year evaluation with annual analysis and midterm adjustment
       • Independent quantitative evaluation and peer review → integration of quantitative quality assessment and peer review
       • Simple external evaluation → combination of self-evaluation, group discussion, and external evaluation
       • Performance evaluation only → combination of performance evaluation with process management and future development
    8. Framework of Comprehensive Evaluation (2005); period: 5 years
       Evaluation process: self-evaluation → communication review → peer review → quantitative analysis → on-site review
       Decision process: group decision → final decision
    9. Evaluation Process (2001-2005 and 2006-2010)
       • Self-evaluation: to identify problems and weaknesses
       • Communication review: assessment of both research quality and management
       • Peer review: emphasis on output quality
       • Quantitative analysis: output performance and management
       • On-site review: diagnosis and guidance from CAS headquarters to the institutes
       Scope: S&T outcomes; management mechanism; development strategy; planning for the next 5 years
    10. Evaluation Result and its Application (Comprehensive Evaluation, 2005)
        Results (number of institutes rated outstanding / good / acceptable):
        • Basic Research: 5 / 8 / 1
        • High-Tech: 9 / 18 / 2
        • Sustainable Development: 12 / 23 / 2
        Applications:
        • Providing evidence for the approval of the five-year strategic plan
        • Database for analyzing status and development
        • Linked to resource allocation
        • Linked to institute directors' salaries
    11. Indicators
        Institutional criteria (abilities): developing S&T productivity; system renovation; leading innovation; transferring S&T outcomes; absorbing international resources.
        Elements: excellent scientists; R&D inputs; research infrastructure; S&T outputs; operational management; internal and external cooperation; strategic decision-making; key S&T projects undertaken; pre-research in cutting-edge fields; innovative culture; dissemination of science; technology diffusion; international exchange; international cooperation.
        Quantitative monitoring is based on 24 fundamental indicators:
        1. Publications: papers, books
        2. Academic talks
        3. Awards
        4. Patents
        5. Software copyrights
        6. New medicines, pesticides, veterinary drugs, fertilizers, and breeds
        7. Standards establishment
        8. Consultation reports
        9. Technology transfer
        10. Economic benefits of stock-owned enterprises
        11. Social and economic benefits of technology transfer
        12. Graduate education
        13. Postdoctoral training and academic visitors
        14. Training and science popularization
        15. Construction and maintenance of large-scale research equipment/platforms
        16. Significant innovation contributions
        17. Projects
        18. Excellent research leaders
        19. Infrastructure and capacity
        20. Funds
        21. Achievement of S&T goals
        22. Policy implementation
        23. Building of innovation culture
        24. Education
    12. Quantitative Monitoring Formula
        • Adopts the calculation method of the GDP index
        • Capable of lateral and longitudinal comparison
          - Lateral comparison provides CAS with information on the institutes' status
          - Longitudinal comparison enables the institutes' self-monitoring
        • The database supports analysis of the overall development of CAS
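The slide names a GDP-index-style calculation but does not give the formula itself. As a hedged sketch only: a common form of such an index weights each output indicator, sums the weighted outputs, and expresses the total relative to a base period (set to 100), which is what makes values comparable across institutes (lateral) and across years for one institute (longitudinal). The weights and indicator names below are illustrative assumptions, not CAS's actual ones.

```python
# Sketch of a base-period-weighted index in the style of a GDP index.
# Weights and indicators are hypothetical; CAS's actual formula is not
# given on the slide.

def monitoring_index(outputs: dict, base_outputs: dict, weights: dict) -> float:
    """Weighted outputs relative to the base period, with base = 100."""
    current = sum(weights[k] * outputs[k] for k in weights)
    base = sum(weights[k] * base_outputs[k] for k in weights)
    return 100.0 * current / base

weights = {"papers": 1.0, "patents": 2.0, "awards": 5.0}  # illustrative weights
base_2001 = {"papers": 120, "patents": 10, "awards": 2}   # base-period outputs
year_2004 = {"papers": 150, "patents": 18, "awards": 3}   # current outputs

# Index above 100 indicates growth relative to the base period.
print(round(monitoring_index(year_2004, base_2001, weights), 1))  # 134.0
```

With a fixed base period, the same function supports both comparisons mentioned on the slide: compute it for many institutes in one year (lateral) or for one institute across years (longitudinal).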
    13. Qualitative Monitoring
        Qualitative monitoring of the essential management elements of institutes, collected through the annual self-evaluation form (8 aspects, 31 indicators) and also serving as indicators for S&T management and policy guidance:
        • Execution of the strategic plan
        • S&T layout and adjustment
        • Project advising and undertaking
        • Human resources
        • Promoting communication among institutes with similar research fields
        • Managerial control of institutes
        • Management innovation
        • S&T facilities
        • Innovative culture
        • S&T outputs and their effects
    14. Monitoring Result
        Qualitative monitoring + quantitative monitoring → monitoring result: a status report for each institute.
    15. 2 Three-Hierarchy Evaluation Model
    16. Three-Hierarchy Evaluation Model
        • Bottom hierarchy (quantitative and qualitative): yearly quantitative and qualitative monitoring, using indicators that reflect characteristics common to all institutes. This layer is essentially an improvement of the former monitoring.
        • Second hierarchy (key indicators): key indicators and benchmarks that reflect the characteristics of each category of institute. They can be used both for diagnostic evaluation of an individual institute and for comparison between institutes of the same category; newly developed indicators reflect the individual characteristics of each institute.
        • Top hierarchy (peer review): qualitative evaluation by expert review, and also a layer of individual evaluation. Following the examples of MPG and RIKEN, each institute will have an advisory committee that periodically provides suggestions on strategy and layout. This layer is individual for each institute.
    17. Key Indicators of Pilot Institutes
        Common indicators: 1. S&T talents; 2. funding per FTE; 3. awards.
        Pilot institutes: Institute of High Energy Physics; Institute of Computing Technology; Cold and Arid Regions Environmental & Engineering Research Institute; Shanghai Institute of Technical Physics; Dalian Institute of Chemical Physics; Institute of Microbiology; Institute of Physics.
        Institute-specific key indicators include: invited talks; high-quality papers; undertaking important tasks; outcome transfer; building and running of large facilities; platforms of resources and data; engineering application and demonstration.
    18. Benchmarks
        • Based on analysis of international research institutions and empirical analysis of the characteristics of the pilot institutes.
        • Four methods for setting benchmarks were used in our research:
          - the more the better;
          - different standards based on world-class criteria;
          - based on the development situation of the institutes;
          - based on experience.
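The slide lists the four benchmark-setting methods without showing how a benchmark is applied. As an illustrative sketch only (the scoring rules, thresholds, and cap below are assumptions, not from the presentation), the two quantitative styles can be expressed as simple scoring functions: "the more the better" scores an indicator proportionally to its benchmark, while a world-criteria standard maps the value onto discrete tiers.

```python
# Hypothetical scoring rules for two of the four benchmark styles.
# All numbers here are illustrative, not CAS benchmarks.

def score_more_is_better(value: float, benchmark: float, cap: float = 1.5) -> float:
    """Score proportionally to the benchmark; capped so one indicator
    cannot dominate the overall result."""
    return min(value / benchmark, cap)

def score_tiered(value: float, tiers: list) -> str:
    """Map a value onto ordered (threshold, label) tiers, highest first."""
    for threshold, label in tiers:
        if value >= threshold:
            return label
    return "below standard"

# Illustrative tiers for a world-criteria style benchmark.
tiers = [(100, "world-class"), (60, "good"), (30, "acceptable")]

print(score_more_is_better(120, 100))  # 1.2
print(score_tiered(75, tiers))         # good
```

The remaining two methods (based on the institute's own development situation, and based on experience) amount to choosing `benchmark` or `tiers` per institute rather than universally, which is exactly the open question raised in the discussion slides.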
    19. 3 Conclusions and Discussions
    20. Conclusions and Discussions (1)
        Conclusions:
        • Key indicators and benchmarks for different types of institutes were proposed, including key quantitative indicators with benchmarks and key qualitative indicators with an anchoring method. On this basis, the three-hierarchy evaluation model was established.
        • The new model places more emphasis on evaluation by category and on the individual features of institutes.
        • In summary, this model advances the quality-oriented comprehensive evaluation system methodologically.
    21. Conclusions and Discussions (2)
        Discussions:
        • Institute classification: How should institutes be classified into categories? What is the proper scale for a category? Once category features are demonstrated, should institutes be allowed to choose among different types of evaluation scheme themselves?
        • Key indicators and benchmarks: Should benchmarks be set on the basis of each institute's individual features, or should universal benchmarks be adopted for all institutes of the same category?
        • If key indicators and benchmarks are used in evaluation, how should the management system at the CAS level be reorganized to meet the requirements of this evaluation method? And who can approve the key indicators and their benchmarks?
    22. Thanks
        Contact information: [email_address]