Measuring S&T Performance at Environment Canada


Presentation given at the PIPSC 2nd Science Policy Symposium (May 12-14, 2010), Hilton Lac-Leamy, Gatineau, by Eric Gagné, Acting Director, Science Policy Division, Science & Technology Branch, Environment Canada


Speaker notes:
  • I’ll first give quick context on why we undertook this performance measurement report. I’ll then share results from our 2009 report on R&D performance, with a focus on the process and methods, and provide an update on work currently underway on measuring related scientific activities. Finally, I’ll conclude by discussing goals and opportunities for the federal S&T community associated with this ongoing work.
  • EC undertook these activities for three main reasons: 1) to respond to key commitments from the EC Science Plan and the Federal S&T Strategy; 2) to better communicate the S&T story and its impacts; 3) to better manage S&T as a horizontal, cross-cutting activity.
  • R&D is associated with the innovation chain and knowledge creation. RSA is associated with “public good” science – monitoring, risk assessments, scientific service delivery, surveys, standards development. Basically, RSA is all S&T that is not R&D. In measuring performance, there is generally more focus on R&D because of its links to innovation and economic prosperity (see Benoit Godin’s important work on this). Though R&D and RSA are closely linked, it is difficult to measure how RSA contributes to the innovation system or to economic prosperity.
  • EC is a large producer of both R&D and RSA: 70% of our expenditures go towards S&T. We also know that EC is the largest producer of environmental research in Canada, ranking 7th in the world, according to an independent 2006 study by Science-Metrix. StatCan data reveal that EC is the largest natural science RSA performer in Canada.
  • Current government reporting structures are vertical, but S&T is a horizontal activity. Though EC has three S&T strategic outcomes, we did not have any mechanism to measure S&T within the department. Our approach to S&T measurement is a step towards overcoming this challenge.
  • Because the various components of S&T (R&D, RSA, science policy) require different methodologies to assess performance, EC is taking a phased approach. R&D was a good starting point because of its readily accessible indicators; we completed this phase last December. With the experience and competencies gained in the performance measurement field, we are now tackling RSA – I’ll talk about the process in a few slides. Phase 3 aims to take it one step further and look at how we manage our science; for example, are our current governance structures effective? Basically, evaluating science policy capacity. Having gained experience in measuring the three components, the last step would be to integrate all three into one report. This is currently a lofty goal, and we hope to achieve this vision one day.
  • To measure R&D performance within EC, a new framework was created. The four principles of the framework (alignment, linkages, excellence, enabling environment) are aligned with the Federal S&T Framework of 2005 and with the EC Science Plan. Each principle implied the development of several key qualitative and quantitative indicators: bibliometrics (collaborations, citation rates, etc.), surveys, quantitative HR information, and expenditures and personnel statistics.
  • Qualitative and quantitative data were used to assess indicators for each of the four principles. I will provide an example for each principle in the following four slides. For more details on results, see the report itself (on-line, under “Managing S&T”).
  • To assess alignment, we surveyed R&D performers, users and funders across the Department to examine the timeliness and responsiveness of EC’s R&D. This slide represents the links between performers and users. The arrows point from R&D producers to users, and the thickness of the lines represents the intensity of the links (i.e. the number of times links between OPs were reported). The “hubs”, or circles, happen to be priority areas for EC and the federal government. For example, people in the risk assessment group are both R&D producers and users; for instance, they exchange R&D data with the aquatic ecosystems protection and conservation group.
  • We have found that EC’s collaboration rate in publications is increasing, particularly through new international collaborations, though national collaboration rates were already very high. [Transition]: We were also able to map collaborations with key Canadian partners across all sectors.
  • This graph shows collaboration between EC and its collaborators, based on peer-reviewed publications and co-authorships. The thicker the line or the larger the circle, the greater the number of collaborations. At close to 50%, universities account for the majority of collaborations. Other important connections are with major players in environmental research (University of Toronto, DFO, UBC).
  • The average relative citations (ARC) index was used to measure the impact of our science. ARC reflects the number of times a paper is cited compared to others in the same field. On average, Canadian environmental research publications are cited 10% more often than the world average; EC publications in this area are cited 40% more often than the world average. This type of result demonstrates that our contribution to national and international environmental science is strong.
  • Our enabling environment was rated as “fair” for several reasons. One is that federal and EC intramural (i.e. in-house) R&D funding in O&M and salaries has remained relatively constant since 2004. Funding for R&D infrastructure (bricks and mortar, equipment) appears to be on the rise; however, these funds are usually for specific programs and are not sufficient to address the general rust-out issue we are all facing in the federal science community. In addition, some new investments (not shown here) include $14 million under Budget 2009 for modernizing EC laboratories and $770K (through INAC) for upgrading Arctic research facilities. On personnel: how can we mitigate the risks of retirement – are we hiring enough people to fill the gap? Categories for R&D personnel are based primarily on classifications.
  • [Transition slide – read quickly] So far we have focussed on R&D. What about everything else we do in federal S&T? Let’s look at RSA.
  • Why measure RSA? To better account for the $1.5B of expenditures on natural science RSA performed across the federal government. To better understand and communicate the role of “public good” science. To situate RSA in the federal science and innovation system. To better manage RSA as a cross-cutting, horizontal function, addressing potential gaps and challenges and taking advantage of opportunities. Basically, we need to take stock of RSA and RSA performance.
  • Taking a different approach than for the R&D report, we created an opportunity to work with other SBDAs on a product that would benefit the larger S&T community. EC hosted and facilitated a series of workshops that eventually led to an outline of a white paper on RSA measurement. These workshops revealed common challenges but also new ideas on how to proceed. Bottom line: participants across SBDAs have agreed that this is an important issue to work on – together.
  • Why are we here today? 1) We have lessons to share with anyone considering this type of work. 2) This type of tool has provided us with evidence that now allows us to “tell the S&T story” more effectively to key stakeholders. 3) At EC, before we launch the RSA assessment internally, we would like to ensure that the toolbox we create is transferable to the entire science community, so that by using the same tools we can share best practices and learn from one another. Your contributions to the draft White Paper on GCPedia will be greatly appreciated until the end of May.

    1. Measuring S&T Performance at Environment Canada. Eric Gagné, A/Director, Science Policy Division, Environment Canada. PIPSC Science Policy Symposium, May 13, 2010.
    2. Outline
       • Context
         – Why measure S&T performance?
         – Overview of S&T at Environment Canada (EC)
       • Measuring R&D performance: methods and results from the first report (2009)
         – Available on-line (under “Managing S&T”)
       • Measuring Related Scientific Activities (RSA) performance: a work in progress
       (Slide shows report covers: “Measuring the Performance of Related Scientific Activities”, 2009 and 2011.)
    3. Why measure S&T performance at EC?
       • EC Science Plan commitments (2007): “…performance measures will provide information to monitor progress, inform future editions of the Plan, and ensure a process of continuous improvement within Environment Canada’s science.”
       • Federal S&T Strategy (2007): “Canada’s federal government will increase its accountability to Canadians by improving its ability to measure and report on the impacts of its S&T expenditures.”
       • Better management of horizontal S&T issues
       • Strategic positioning of S&T to inform large-scale reviews / audits and to communicate EC S&T activities
    4. R&D / RSA Overview
       • As defined by the OECD / Statistics Canada:
         – Research & Development: “Creative work undertaken on a systematic basis in order to increase the stock of scientific and technical knowledge and to use this knowledge in new applications.” Associated with the innovation chain and new knowledge creation.
         – Related Scientific Activities: “Those activities which complement and extend R&D by contributing to the generation, dissemination and application of scientific and technological knowledge.” Associated with “public good” science – monitoring, risk assessments, scientific service delivery, surveys, standards development, etc.
       • R&D and RSA have distinct functions but are highly interconnected and inform one another.
    5. Overview of R&D / RSA at EC
       • R&D: ~25% of EC expenditures ($250M) and ~15% of personnel (1,000 full-time equivalents). Provides the basic knowledge and credibility to support EC’s regulations, policies, programs, etc. EC is one of the top 10 environmental R&D institutions in the world, and the top producer of environmental research in Canada (Science-Metrix, 2006).
       • RSA: ~45% of EC expenditures ($450M) and ~40% of personnel (2,500 full-time equivalents). Directly allows EC to deliver on its mandate, through monitoring, testing, prediction, etc. EC is the largest performer of natural science RSA in the federal government.
    6. Challenges for measuring S&T at EC
       • All Strategic Outcomes and Program Activities rely on S&T
       • With current reporting structures, the challenge is to measure S&T as a horizontal activity
       (Figure: Environment Canada’s Program Activity Architecture (PAA).)
    7. Phased approach to measuring S&T
       • Phase 1 (COMPLETED): Measurement of R&D performance, using readily accessible quantitative and qualitative indicators
         – EC developed a logic model and performance measurement framework for R&D, using four principles articulated in the EC Science Plan
         – The 2009 report is the first of its kind at EC
       • Phase 2 (IN PROGRESS): Measurement of RSA performance
         – Led by EC, the Interdepartmental Working Group is developing a new logic model, framework and preliminary set of indicators to measure federal RSA
         – Report on RSA at EC to be published in 2011
       • Phase 3 (PLANNED): Measurement of EC’s S&T management and governance (S&T policy)
       • Phase 4: Integration of S&T and policy performance measures
    8. Phase 1: Measuring R&D performance
       • Reviewing current best practices, and in collaboration with scientists, managers and performance measurement experts across EC, we developed the EC Framework for R&D Performance Measurement.
       • Each principle implied the development of several key qualitative and quantitative indicators:
         – Bibliometrics (collaborations, citation rates, etc.)
         – Surveys
         – Quantitative HR information
         – Expenditures and personnel (StatCan)
       Source: “Managing S&T”
    9. A “dashboard” performance measurement of R&D. Source: “Managing S&T”
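The “dashboard” slide summarizes R&D performance as a qualitative rating per framework principle (the enabling environment, for instance, was rated “fair”). The following is a minimal, illustrative sketch of how indicator scores might be rolled up into such ratings. The principle names come from the framework, but the scores, thresholds and three-level scale are invented assumptions, not the report’s actual methodology.

```python
# Hypothetical sketch: roll up indicator scores (0-100) into a per-principle
# dashboard rating. Scores and thresholds below are invented for illustration.

def rate(score):
    """Map a 0-100 score to a dashboard rating (thresholds are assumptions)."""
    if score >= 75:
        return "good"
    if score >= 50:
        return "fair"
    return "poor"

indicators = {
    "Alignment":            [80, 70],  # e.g. survey-based timeliness, responsiveness
    "Linkages":             [85, 90],  # e.g. collaboration rates
    "Excellence":           [88],      # e.g. relative citation impact
    "Enabling Environment": [55, 60],  # e.g. funding trends, personnel renewal
}

# Average each principle's indicator scores, then map to a rating.
dashboard = {
    principle: rate(sum(scores) / len(scores))
    for principle, scores in indicators.items()
}

for principle, rating in dashboard.items():
    print(f"{principle}: {rating}")
```

With these invented inputs, the rollup happens to reproduce the report’s “fair” rating for the enabling environment, but any resemblance beyond that is illustrative only.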
    10. Alignment: Linking R&D with users
       • Strong horizontal links between R&D users and producers at EC
       • R&D “hubs” often correspond to government-wide priorities
       • Internal functions (policy, regulation, etc.), as the main users of EC’s R&D, found it both timely and responsive
       (Figure: R&D links across EC’s PAA showing the connections between producers and users of R&D, 2009. Nodes include risk assessment; aquatic ecosystems protection / conservation; environmental monitoring; strategic approaches; governance and policy coordination of the ecosystem approach; wildlife protection / conservation; risk management; risk mitigation and implementation; weather and environmental prediction research; information on pollutants and GHGs; conservation of the environment; environmental protection; internal services; weather and climate services.) Source: “Managing S&T”
    11. Linkages: Increasing collaboration rates
       • We revealed collaborations with key sectors, geographic areas and scientific specialties
       • We found strong connections with universities in particular (over half of all papers published)
       • Collaboration greatly increases the number of citations a paper receives
       (Figure: EC’s collaboration rates, 2003-2007.) Source: “Managing S&T”
    12. Linkages: National collaboration networks
       (Figure: Collaborations between EC branches / directorates and with partners, produced by Science-Metrix, 2009. Note: thickness of lines and size of circles (EC directorates / branches) indicate the number of collaborations.) Source: “Managing S&T”
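In the collaboration-network figure, each edge is weighted by the number of co-authored papers between two organizations. As a minimal sketch, edge weights of that kind can be derived from per-paper institution lists; the paper data below is invented, and a real bibliometric pipeline (such as the Science-Metrix analysis) would add affiliation disambiguation and a much larger corpus.

```python
from collections import Counter
from itertools import combinations

# Hypothetical sketch: derive weighted collaboration edges (edge weight =
# number of jointly authored papers) from per-paper institution sets.
# The papers below are invented for illustration.
papers = [
    {"EC", "University of Toronto"},
    {"EC", "University of Toronto", "DFO"},
    {"EC", "UBC"},
]

edge_weights = Counter()
for institutions in papers:
    # Each unordered pair of institutions on a paper counts as one collaboration;
    # sorting makes the pair a canonical dictionary key.
    for pair in combinations(sorted(institutions), 2):
        edge_weights[pair] += 1

# Heaviest edges first, mirroring the thickest lines in the figure.
for (a, b), weight in edge_weights.most_common():
    print(f"{a} -- {b}: {weight}")
```

In a plotting step, the edge weight would drive line thickness and a node’s total weight its circle size, as in the slide.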
    13. Excellence: Benchmarking productivity and impact
       • EC environmental research publications are cited 40% more often than the world average
       • EC benefits from particularly high levels of specialization in certain environmental research sub-fields
       • Many high-impact publications in first-rate interdisciplinary journals
       Source: “Managing S&T”
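The speaker notes describe the average relative citations (ARC) index as each paper’s citation count compared to others in the same field, averaged across papers; on that reading, an ARC of 1.4 corresponds to being cited 40% more often than the world average. A minimal sketch under that definition follows; the citation counts and world-average baselines are invented, and the real indicator (as computed by bibliometric firms) normalizes more carefully by field, document type and year.

```python
# Sketch of an average relative citations (ARC) index: divide each paper's
# citations by the world-average citation count for its field and year,
# then average the ratios. All data below is invented for illustration.

world_avg = {("ecology", 2006): 10.0, ("atmospheric", 2006): 8.0}  # assumed baselines

papers = [
    {"field": "ecology", "year": 2006, "citations": 14},
    {"field": "atmospheric", "year": 2006, "citations": 12},
    {"field": "ecology", "year": 2006, "citations": 16},
]

relative = [p["citations"] / world_avg[(p["field"], p["year"])] for p in papers]
arc = sum(relative) / len(relative)
print(f"ARC = {arc:.2f}")  # values above 1.0 mean cited more than the world average
```

Here the three invented ratios (1.4, 1.5, 1.6) average to an ARC of 1.50, i.e. 50% above the world average in this toy data.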
    14. Enabling Environment: Cross-cutting inputs that make a difference
       • Constant intramural expenditures
       • Increasing number of R&D personnel
       • Recent increases in capital spending
       (Figures: EC’s R&D expenditures (current dollars); EC’s R&D capital expenditures (current dollars); R&D personnel at EC.) Source: “Managing S&T”
    15. Next steps: Moving the “yardstick” forward
       • Our tools for R&D measurement are largely transferable across science-based departments and agencies
       • What about the rest of federal S&T?
       • How can we adapt and extend existing tools and approaches to measure related scientific activities?
       • Towards RSA performance measurement…
       (Figure: Federal RSA / R&D expenditures, current dollars.)
    16. Phase 2: Why measure RSA?
       • To better account for $1.5B of expenditures on natural science RSA performed across the federal government
       • To better understand “public good” S&T – RSA is almost exclusively performed by governments and often directly supports science-based regulations / legislation, services to Canadians, etc.
       • To better situate RSA in the federal S&T and innovation system
       • To better manage RSA as a cross-cutting horizontal function, addressing potential gaps / challenges and taking advantage of opportunities
       (Figure: Top federal spenders on intramural natural sciences and engineering RSA, 2006-07; Environment Canada accounts for 24%.) Source: “Related Scientific Activities”
    17. A collaborative approach to developing a functional toolbox
       • Process:
         – Measuring RSA is a government-wide challenge
         – Our goal is to develop tools to measure RSA that will benefit the entire federal S&T community
         – EC created a working group to consult with evaluators, scientists, policy analysts, managers and corporate planners across SBDAs
         – An open space for collaboration is provided on GCPedia
       • Expected results:
         – Identify RSA performance measurement drivers (management, horizontality and accountability), applications and intended audiences
         – Provide background information on RSA (overall, within the federal government and within EC)
         – Lay out the RSA measurement framework and logic model
         – Introduce possible RSA indicators, their range of applicability and how to use them
         – Indicate a path forward for developing and using tailored performance measures
       Source: “Related Scientific Activities”
    18. Summary
       • Expected result:
         – A 2011 report on EC’s RSA performance that will contribute to better accounting for the “other half” of federal S&T
         – Development of an RSA measurement toolbox, in collaboration with other departments
       • Result:
         – Demonstration of EC’s performance based on the four framework elements
         – Set of transferable tools for measuring R&D developed
         – The 2009 report: Measuring Environment Canada’s R&D Performance