Cifor annual meeting_ebf_gp3



The potential for evidence-based forestry to improve policy-making and enhance the scientific knowledge base in forestry

Published in: Education
• The aim is to use good information (the best evidence) well – disseminated widely and in ways that reach the target audiences. There are plenty of examples of missed opportunities, where good information simply is not disseminated or picked up by those who need it. Bad information (weak evidence) is everywhere, but be especially aware of the danger of poor information that is used in ways that look good and that have real impact. Poor information used ‘well’ is the scourge of policy and decision-making.
• Important elements of systematic review are: formulating a clear question that is needed by decision-makers (i.e. not a ‘blue-skies’ type of primary research question); retrieving all relevant information to answer the question, while minimising bias; extracting relevant information/data in a way that minimises bias; actively seeking widespread ‘peer review’ from networks (e.g. the Evidence-Based Forestry Google group, climate listservs, IUFRO); and actively disseminating the findings and agreeing a plan for re-review later.
• Hand-searching for references, often suggested by wider collaborators and by the networks of people reviewing the systematic review at all stages, and including raw data from, e.g., forest inventories (the black-and-white picture shows a training session by Dr. S.V. Belov of the Leningrad Forest Research Institute on interpreting aerial photos, 1963).
• All evidence, from whatever source, must be judged by the answers to these five questions, which are outlined fully in the review Protocol, a document written for every systematic review before the review is undertaken. The Protocol is reviewed externally as widely as possible and all changes are tracked. The process throughout the review is transparent and open to feedback.
Transcript of "Cifor annual meeting_ebf_gp3"

1. Evidence-Based Policy Session – CIFOR Annual Meeting 2012, 01/10/2012
2. Evidence-Based Policy Session: Moving towards ‘Evidence-Based Forestry’ in CIFOR. Gillian Petrokofsky, University of Oxford, gillian.petrokofsky@zoo.ox.ac.uk
3. Context of science for policy – the current ‘haphazard’ situation: of the total body of research, only some research is used, and there are no clear methods for accessing or analysing the research used for decision-making.
4. Current ‘haphazard’ situation vs. Evidence-Based Forestry. Haphazard: research is drawn from the total body of research with no clear methods for accessing or analysing what is used for decision-making. Evidence-Based Forestry: robust, ‘scientific’ methods for accessing and analysing the research used for decision-making.
5. Current ‘haphazard’ situation: research used from the total body of research; no clear methods for accessing or analysing research used for decision-making; results not actively disseminated. Evidence-Based Forestry: collaboration/participation in defining the research agenda and in systematic reviews; repeatable methods for accessing and analysing research used for decision-making; results actively disseminated.
6. The knowledge-use challenge
7. Model – evidence-based medicine (EBM): EBM sits at the intersection of expert opinion, best science, and individual need & preference.
8. Model – evidence-based forestry (EBF): EBF sits at the intersection of expert opinion, best science, and society’s needs & preferences.
9. But what are the priorities? Who decides what the priority agenda is – science research, policy, policy push/science pull, science-driven? Collaborative conversations.
10. Collaboration at every stage: asking the most useful questions; finding the most useful evidence; agreeing on how to analyse the evidence; widespread feedback and peer review at all stages; telling people what you discovered.
11. Identifying research questions: the T10Q Project
12. Step 1 – online survey: 1,594 questions collected (Environment 692, Social 456, Economics 446). Step 2 – two-day workshop: Top Ten Questions.
13. Self-selected delegates; Delphi process; discussion; voting; final list of 10 questions; collaborative peer-reviewed paper. Pictures courtesy of Steven Heathcote.
14. The 1,594 questions (Environment 692, Social 456, Economics 446) fell into 14 themes: 1. Forest economics & trade; 2. Forest management, silviculture & forest operations; 3. Ecosystem services; 4. Biodiversity & conservation; 5. Climate change & global warming; 6. Decision-making & public opinion; 7. Biofuel, energy from biomass; 8. Carbon sequestration, carbon cycle; 9. Afforestation & forest plantations; 10. Soil and water; 11. Pests, diseases & invasives; 12. Urban forestry, urban trees, arboriculture; 13. Land use & landscape; 14. Miscellaneous.
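As a rough illustration of how the survey responses behind slides 12 and 14 could be collated, the sketch below tallies question records by dimension and theme. The data structure and field names are hypothetical; on the real T10Q data the dimension totals should reproduce the slide's figures (692 + 456 + 446 = 1,594).

```python
from collections import Counter

# Hypothetical question records; field names and entries are illustrative only.
questions = [
    {"dimension": "Environment", "theme": "Climate change & global warming"},
    {"dimension": "Social", "theme": "Decision-making & public opinion"},
    {"dimension": "Economics", "theme": "Forest economics & trade"},
    # ... remaining survey submissions ...
]

by_dimension = Counter(q["dimension"] for q in questions)
by_theme = Counter(q["theme"] for q in questions)

print(f"Total questions collected: {len(questions)}")
for dimension, count in by_dimension.most_common():
    print(f"  {dimension}: {count}")
for theme, count in by_theme.most_common():
    print(f"  {theme}: {count}")
```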
15. T10Q Top Ten Questions (first five shown): 1. What are the most technically and cost-effective ways of identifying, monitoring, and controlling invasive species, pests and disease? 2. How can we achieve better understanding between foresters and other parts of society? 3. What are the most effective landscape planting schemes to ensure connectivity between woodland fragments whilst maintaining connectivity between other land-use types? 4. What is the value of forestry to human health and well-being? 5. Who are the private woodland owners and how can they be engaged and influenced? What are their concerns?
16. Checking relevance and validity: numbers of papers from the EU vs. global numbers published; papers published in the last 5 years. Petrokofsky, G., Brown, N.D., Hemery, G.E. Matching a scientific knowledge base with stakeholders’ needs: the T10Q project as a case study for forestry. Forest Policy and Economics.
17. Reflections on the process: the survey tool was effective for collecting and sharing large amounts of information; it contributes to reducing potential and perceived biases by engaging with different ‘stakeholders’ and enabling ‘bottom-up’ collaborative discussion; the focussed workshop was a new experience for some participants; Delphi puts ‘experts on tap’, not ‘on top’; voting has strengths and weaknesses; common understanding is difficult; the process generates themes – for Cochrane-style systematic reviews and review groups.
18. Model – evidence-based forestry (EBF): EBF sits at the intersection of expert opinion, best science, and society’s needs & preferences.
19. Problem – evaluating ALL the literature: not all data and information are on the Web; not all data and information are free; science is not only in English-language publications; evidence does not come only from peer-reviewed journals; much research, especially that with negative or inconclusive results, may fail to be published in journals, so carefully collected data are ‘lost’; and there are too many publications for an individual to find and assess.
20. Problem – evaluating ALL the literature of relevance to forest carbon
21. Problem – bias in the literature base: publication bias; language bias; regional & developed-country bias; funding bias; database bias; reviewer bias; quality-assessment bias; reporting bias; methodological bias; outcome variable selection & within-study reporting bias. Chalmers, I. (2003). Trying to do more good than harm in policy and practice: the role of rigorous, transparent, up-to-date evaluations. Annals of the American Academy of Political and Social Science 589 (Misleading Evidence and Evidence-Led Policy: Making Social Science More Experimental), 22–40.
22. Problem – bias in the literature base: a hierarchy of evidence
23. Framework for incorporating evidence: question framing (involving stakeholders; defining what is to be examined and how) leads to an explicit question; then systematic evaluation of evidence (rigorous review methodology; transparent; repeatable); then active dissemination of results (to all stakeholders and decision-makers, in appropriate formats for different end users); with a commitment to update.
24. Collaboration at every stage: asking the most useful questions; finding the most useful evidence; agreeing on how to analyse the evidence; widespread feedback and peer review at all stages; telling people what you discovered.
25. Rigour can be applied to ALL outputs that aim to summarise science: question framing (involving stakeholders; defining what is to be examined and how) leads to an explicit question; then systematic evaluation of evidence (rigorous review methodology; transparent; repeatable); then active dissemination of results (to all stakeholders and decision-makers, in appropriate formats for different end users); with a commitment to update.
26. Reviewing the evidence: an explicit question (set by decision-makers; involving stakeholders; defining what is to be examined and how); systematic evaluation of evidence (rigorous review methodology; transparent; repeatable); active dissemination of results (to all stakeholders, including decision-makers; appropriate formats for different end users); and a commitment to update.
27. Who has adopted systematic reviewing as the ‘gold standard’?
28. Stages of a review – the Protocol
29. REDD+ and carbon measurement: REDD+ actions are rewarded according to the value of the tons of carbon mitigated; credits were globally valued at US$126 billion in 2008 (Capoor and Ambrosi 2009); this will require accurate, credible mensuration of forest carbon; current national forest monitoring systems are often of poor quality (Holmgren & Marklund 2007).
30. Don’t we already know how to assess carbon? Asner et al., 2011. A universal airborne LiDAR approach for tropical forest carbon mapping. Oecologia, doi:10.1007/s00442-011-2165-z
31. Exciting claims as science progresses: “Using arguably the world’s most intensively studied tropical forest plot (STRI’s 50-ha plot at Barro Colorado Island, Panama), Mascaro et al. find that lidar-based uncertainties of aboveground carbon stocks are indistinguishable from errors obtained when doing the most detailed plot-based estimates.”
32. Importance of question framing. Process: brainstorming at FAO, March 2009; refining at UNFCCC Bonn, June 2009; peer review and further refining. Main question: how do current methods compare in their ability to measure and assess terrestrial carbon stocks, and changes in carbon stocks, with accuracy, precision and repeatability? Sub-questions – how accurate, precise and repeatable are: methods used for the conversion of in situ measurements into carbon stock estimates at the site level; methods for generating carbon stock estimates for larger geographical areas (landscape level) from site-level data; and direct remote sensing methodologies for estimating carbon stocks?
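To make the first sub-question concrete, here is a minimal sketch of converting in situ tree measurements into a site-level carbon stock estimate. The power-law allometric form, its coefficients, and the ~0.47 carbon fraction of dry biomass are generic placeholders for illustration, not the specific equations whose comparative performance the review examines.

```python
def tree_agb_kg(dbh_cm: float, a: float = 0.0673, b: float = 2.5) -> float:
    """Above-ground biomass (kg dry matter) from a generic power-law allometry.

    AGB = a * DBH^b is one common functional form; the coefficients used here
    are illustrative placeholders, not a validated equation.
    """
    return a * dbh_cm ** b

def plot_carbon_t_per_ha(dbh_list_cm, plot_area_ha, carbon_fraction=0.47):
    """Carbon stock (tonnes C per hectare) for one inventory plot.

    A carbon fraction of ~0.47 of dry biomass is a commonly used default,
    but it varies with species and region.
    """
    agb_kg = sum(tree_agb_kg(d) for d in dbh_list_cm)
    return (agb_kg / 1000.0) * carbon_fraction / plot_area_ha

# Hypothetical 0.1 ha plot with five measured stem diameters (cm):
print(plot_carbon_t_per_ha([12.5, 31.0, 22.4, 45.8, 18.1], plot_area_ha=0.1))
```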
33. Peer-reviewed studies are not the whole story. Valuable forestry and environmental data and information are in ‘grey’ (‘fugitive’) literature – reports, working papers published independently, occasional papers by organizations, spreadsheets on websites, conference papers – infrequently indexed in bibliographic databases and inadequately retrieved by search engines.
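Retrieving evidence from both bibliographic databases and grey-literature sources relies on documented, repeatable searches. The sketch below assembles a Boolean search string from groups of synonyms; the term groups are illustrative examples for the forest-carbon question, not the actual strings used in the review protocol.

```python
# Illustrative synonym groups; real protocols document the exact strings used.
population_terms = ["forest*", "woodland*", "plantation*"]
outcome_terms = ['"carbon stock*"', '"above-ground biomass"', '"carbon sequestration"']
method_terms = ["allometr*", "lidar", '"remote sensing"', "inventory"]

def or_group(terms):
    """Join synonyms with OR and wrap the group in parentheses."""
    return "(" + " OR ".join(terms) + ")"

search_string = " AND ".join(
    or_group(group) for group in (population_terms, outcome_terms, method_terms)
)
print(search_string)
# (forest* OR woodland* OR plantation*) AND ("carbon stock*" OR ...) AND (allometr* OR ...)
```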
34. Valuable evidence from older studies: your library or research institution may have essential information – collaborate to maximise the evidence base.
35. Systematic review of ALL evidence – provided it meets agreed criteria for inclusion: What characteristics of studies will be used to determine whether a particular piece of evidence is relevant to the topic of interest? What characteristics of studies will lead to their exclusion? Will relevance decisions be based on a reading of report titles, abstracts or full reports? Who will make the relevance decisions? How will the reliability of relevance decisions be assessed?
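The last question on slide 35 – how the reliability of relevance decisions will be assessed – is often answered by having two reviewers screen the same sample and measuring their agreement. A minimal sketch of Cohen's kappa for two hypothetical include/exclude screeners follows; the decision lists are invented for illustration.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters making binary include/exclude decisions."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal inclusion rate.
    p_a, p_b = sum(rater_a) / n, sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions on the same ten titles (1 = include, 0 = exclude):
reviewer_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
reviewer_2 = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # 0.6 on this invented sample
```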
36. Statistics for retrieved papers. From subscription bibliographic databases: 50,841 retrieved, 4,344 after title assessment. From free databases and organization web sites: 6,279 retrieved, 671 after title assessment. Very significant amounts of valuable information are locked behind a subscription firewall – no one institution can afford to get ‘the whole picture’.
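Using the counts on slide 36 (and assuming the 'retrieved' and 'after title assessment' figures pair up by source group as laid out on the slide), a short calculation of the title-screening retention rates:

```python
# Counts from slide 36; the pairing by source group is assumed from the slide layout.
sources = {
    "subscription bibliographic databases": (50_841, 4_344),
    "free databases and organization web sites": (6_279, 671),
}

for name, (retrieved, after_title) in sources.items():
    rate = after_title / retrieved
    print(f"{name}: {retrieved:,} retrieved -> {after_title:,} kept ({rate:.1%})")
# Roughly 8.5% and 10.7% of retrieved records survive title screening.
```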
37. After title assessment: 4,531 papers; after abstract assessment: 650 in Category 2 and 300 in Category 1. Forest papers fall into these seven types – comparisons between: 1. different biomass equation forms; 2. biomass estimation factors and biomass equations; 3. different biomass equations against general biomass equations; 4. different sampling/measurement techniques for dead wood; 5. vegetation models and inversion techniques; 6. eddy data and process-based models; 7. different growth models to estimate carbon. Category 1 papers (comparative studies) tend to be high in quality, although many do not provide statistics on relative precision/uncertainty and instead report correlations between the two methods. Studies often focus on the validity of a newer method by showing its relationship to more conventional methods, but often do not shed light on relative uncertainty, costs, etc. There are many studies of types 1 to 3, but most are too spatially specific, and there are few repetitions of studies that look at any given model.
38. Preliminary findings: there is seemingly a large body of literature comparing methods, but very few papers apply a methodology which tests one method against another in one location at one time, so as to draw robust conclusions about the accuracy, repeatability or affordability of a given method. This may be contrary to the popular belief that the science is pretty well agreed upon. Bottom line: evidence appears to be scarce on the comparative advantages of different methods used to measure carbon.
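The head-to-head test the preliminary findings call for could be summarised, for two methods applied to the same plots at the same time, by the paired mean difference and RMSE between their estimates. The sketch below shows the calculation on invented per-plot figures; it is not data from the review.

```python
import math

def paired_comparison(method_a, method_b):
    """Mean difference and RMSE between two methods' carbon estimates,
    paired by plot (same plots, same time)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    mean_diff = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return mean_diff, rmse

# Invented per-plot carbon estimates (t C/ha) from two methods on the same five plots:
plot_based = [112.0, 98.5, 130.2, 87.4, 105.9]
lidar_based = [118.3, 95.1, 127.8, 91.0, 110.2]
mean_diff, rmse = paired_comparison(lidar_based, plot_based)
print(f"mean difference: {mean_diff:.2f} t C/ha, RMSE: {rmse:.2f} t C/ha")
```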
39. Tentative conclusions: measuring and monitoring forest carbon accurately and reliably is an absolute requirement for the success of REDD+; REDD+ is at a critical stage of development which coincides with a time of increased public scepticism about climate science; for REDD+ emission activities to be credible, national monitoring systems need to be evidence-based; the systematic review of carbon measurements will provide a transparent and readily repeatable evidence base which can support decision-making in an important area of climate mitigation.
40. Embracing EBF – typologies we encountered and will encounter (paraphrasing some of Rogers’ (1962) adopter categories). Early rejecter: the person/organisation who strongly believes that the current way is the best one because they have a vested interest (intellectual or economic) in promoting the method (the analogy with health is the drug company which currently has the contract for supplying an effective medicine). Over-eager adopter: the person/organisation who strongly believes that the current way is not the best one because they have a vested interest (intellectual or economic) in promoting an alternative method, often their own (the medical analogy is a competing drug company wanting to break into the market). Methodological sceptic: the person/organisation who has an open mind about the current way and wants to test it.
41. Ties in with agenda-setting for research: research questions, knowledge gaps, collaboration.
42. Why should CIFOR introduce EBF? It fits the organisation’s aims; promotes collaboration and encourages partnership; collaborators from institutes with poor literature resources get access to scientific publications; North–South knowledge and skills sharing; scientists here are already doing some of it; and it is good for careers – peer-reviewed publications.
43. “Evidence-based forestry is the conscientious, explicit, and judicious use of current best evidence in making decisions to enhance provision of products and services from forest resources. It recognizes that forest resource management is context specific, ever-changing, and involves uncertainties, and that the best evidence is derived from a systematic process which aims to minimise bias.” (after Sackett 1996 & McKibbon 1998)
44. Rigour can be applied to ALL outputs that aim to summarise science: question framing (involving stakeholders; defining what is to be examined and how) leads to an explicit question; then systematic evaluation of evidence (rigorous review methodology; transparent; repeatable); then active dissemination of results (to all stakeholders and decision-makers, in appropriate formats for different end users); with a commitment to update.
45. A hierarchy of evidence
46–52. Incremental changes
53. Collaboration and an evidence-based approach can improve forestry research! Evidence-Based Forestry: robust, ‘scientific’ methods for accessing and analysing the research used for decision-making, drawn from the total body of research, with results actively disseminated. CIFOR/Oxford EBF work: representatives from key programmes; enthusiasm; resources available. Thank you for your attention!