GDNet M&E Report 2014 – Year 3

This document provides the annual progress report (Year 3) and update to GDNet’s Baseline and M&E Framework. The report covers the period January to December 2013, with data presented up to April 2014 where relevant and available. The document is structured according to the GDNet logframe, with separate chapters from the Outcome level down through Outputs 1 to 4. A box summarising progress against the logframe indicators in Year 3 is provided at the beginning of each chapter. GDNet will close on 30 June 2014; this is therefore the final M&E report.



GDNet M&E Report 2014 – Year 3
Robbie Gregorowski, Jodie Ellis and Cheryl Brown
May 2014
Table of Contents

Executive Summary
Introduction
Outcome Level – Diverse research and policy audiences make better use of development research from the global south
Output 1 – Southern research better informed by current ideas and knowledge
Output 2 – Researchers better able to communicate their research to policy
Output 3 – Knowledge networking between researchers and with policy actors increased
Output 4 – Lessons about knowledge brokering best practice in the global south learnt and communicated

Annexes (Volume II)
Annex 1: GDNet Year 3 web statistics
Annex 2: GDNet user base annual web survey questionnaire – Year 3
Annex 3: GDNet user base annual web survey results – Year 3
Annex 4: Analysis of web survey responses from GDNet users from the Global South
Annex 5: Cases of knowledge into use in the policy process – Baseline – Year 2 cases
Annex 6: Output 3 indicator 1 – GDNet ‘user base’ interaction – log
Annex 7: GDNet Social Media Annual Stat Report for Year 3 (2013)
Annex 8: Output 4 indicator 1 – Generation of best practice lessons – log
Annex 9: Participants’ Policy Brief Analysis
Annex 10: Checklist for editorial review of participants’ policy briefs
Executive Summary

This document provides the annual progress report (Year 3) and update to GDNet’s Baseline and M&E Framework. The report covers the period January to December 2013, with data presented up to April 2014 where relevant and available. The document is structured according to the GDNet logframe, with separate chapters from the Outcome level down through Outputs 1 to 4. A box summarising progress against the logframe indicators in Year 3 is provided at the beginning of each chapter.

GDNet will close in June 2014. Consequently, in the second half of Year 3 the GDNet team focused on three priorities:

• A well-ordered closure of the programme, ensuring that the targets set for the programme in the logframe are either met or exceeded;
• Documenting the lessons on knowledge brokering best practice generated over the life of the programme; and
• Ensuring that GDNet leaves behind an accessible and sustainable legacy.

GDNet has been successful in meeting all three priorities.

A well-ordered closure of the programme

Below is a summary of progress against the final set of indicator targets set out in the logframe:

• Outcome Indicator 1 – Exceeded target by 9 percentage points
• Outcome Indicator 2 – Met target
• Output 1 Indicator 1 – Exceeded target in terms of use; met target in terms of satisfaction
• Output 1 Indicator 2 – Met target
• Output 2 Indicator 1 – Confidence: exceeded target / Ability: met target
• Output 2 Indicator 2 – Met target
• Output 3 Indicator 1 – User-to-user interaction: met target / Events: met target
• Output 3 Indicator 2 – Exceeded target
• Output 4 Indicator 1 – On track to meet target (June 2014)
• Output 4 Indicator 2 – On track to meet target (June 2014)

GDNet has met or exceeded all of its January 2014 logframe targets and is on track to meet its June 2014 targets.
A short elaboration of each outcome and indicator is provided below.

Outcome Indicator 1 – Southern researchers’ use of other Southern research in their own research

The GDNet user base annual web survey is a key component of the M&E approach and has been conducted annually since the baseline. The response rate is not particularly low for web surveys of this type, which attempt to engage a diffuse and generally passive user group. Nonetheless, the low rate may reflect the general proliferation of online surveys (‘survey fatigue’); in Year 3 it cannot have been helped by the user base having been informed that the programme closes in June 2014, and hence having little incentive to contribute.

Target (January 2014): At least 60% of Southern researchers consulted by the programme use Southern research in their own research to a moderate or great extent.

Year 3 Progress: 69% of respondents use Southern research to a great or moderate extent, an increase of 5 percentage points from Year 2. Exceeded target by 9 percentage points.
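The ‘percentage points’ framing above can be made explicit. A minimal sketch, using only the figures reported in this document (the 60% target, the ~64% Year 2 share, and the 69% Year 3 share): percentage points are absolute differences between two percentages, not relative changes.

```python
# Outcome Indicator 1 figures, as reported above
target = 60         # logframe target: at least 60% (January 2014)
year2_share = 64    # approximate Year 2 share (%)
year3_share = 69    # Year 3 share (%)

# "Percentage points" are absolute differences between two percentages,
# not relative percentage changes.
exceeded_by = year3_share - target         # points above the target
gain_on_year2 = year3_share - year2_share  # points gained since Year 2

print(exceeded_by)    # 9
print(gain_on_year2)  # 5
```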
The Year 3 web survey provided a number of interesting insights into the nature and perception of Southern research among Southern users, and how they tend to combine Northern and Southern research in a complementary manner. Analysis of the survey data suggests that women may be less likely than men to distinguish between Northern and Southern research, but the difference is not large enough to be significant given the smaller sample of responses from women.

Outcome Indicator 2 – Cases of knowledge into use in policy processes

No new cases were developed in Year 3, but existing cases were updated. Reflecting on the case development process over three years shows that GDNet has engaged a wealth of innovative, informed and highly motivated researchers. The cases highlight what makes Southern research distinctive – its practicality and innovativeness – something that more traditional Western research may do well to learn from.

Target (January 2014): A portfolio of 20 up-to-date cases of knowledge into use in the policy process (5 per year).

Year 3 Progress: 21 cases developed, validated and updated throughout the life of the programme. Met target.

As well as providing GDNet with an interesting insight into the research-to-policy processes being pursued by its user base, the cases also offer a deep understanding of the nature of Southern research and how it can and does inform policy and practice.

Output 1 Indicator 1 – Level of use of and satisfaction with GDNet research-orientated online services

The M&E of the level of use of, and satisfaction with, research-orientated online services combines GDNet’s monthly web statistics with data from the annual GDNet user base web survey.

Target (January 2014): 20% cumulative increase in use.
Progress: The GDNet website received an average of 40,103 visitors per month in 2013, equivalent to just over a 16% year-on-year increase in use from 2011 to 2013, or a 32% cumulative increase. Exceeded target.

Target (January 2014): Maintain high level of satisfaction.

Progress: A high proportion of members continue to report satisfaction with each of GDNet’s individual services, and in most cases satisfaction has increased: for example, the Knowledgebase of online papers was rated extremely or moderately useful by 69.4% of respondents, and the provision of online journals by 71% (an increase of 8 percentage points on Year 2 in both cases). Met target.

There is a significant difference between men and women in how often they visit the GDNet website: 56% of men visit at least once a month, compared to only 38% of women. There do not seem to be major differences in levels of satisfaction with GDNet services between male and female users, with the exception of the GDNet YouTube channel: of those who expressed an opinion, only 8.4% of men found it not at all useful, compared to 37.3% of women.

Output 1 Indicator 2 – Level of use of and satisfaction with themed services

The assessment of themed services triangulates three sources of data: how GDNet members report their use in terms of frequency and satisfaction (as captured in the survey), the number of hits to the Thematic Windows pages, and the number of visits recorded to the Thematic Landing Pages (taken from GDNet’s web statistics).

Target (January 2014): 10% cumulative increase in use.
Year 3 Progress: 39.1% of survey respondents report using Thematic Windows occasionally or frequently, an increase of 3.2 percentage points from the 2012 baseline. Average monthly hits for Thematic Windows pages were 10,491 in 2013 compared to 8,132 in 2012, an increase of 22%. Met target.

Target (January 2014): Maintain high level of satisfaction.

Year 3 Progress: Thematic Windows are described as ‘extremely’ or ‘moderately’ useful by 42.8% of respondents, no change from 2012. Met target.

There is no significant difference between men and women in terms of use of, or satisfaction with, GDNet’s Thematic Windows.

Output 2 Indicator 1 – Researchers’ confidence and ability to communicate their research – immediately following capacity building effort

In Year 3 the results of GDNet’s capacity building efforts were disaggregated by gender for the first time, and, in response to DFID’s feedback in the Year 2 Annual Review, ability was also assessed through a systematic desk review of policy briefs.

Target (January 2014): Consistent 30–40% increase in confidence at the end of each workshop, regardless of starting point, with year-on-year improvement in the value added by workshops.

Year 3 Progress: Overall increase of 52%. By gender: 43% average increase for male participants, 74% for female participants. Exceeded target.

Target (January 2014): Consistent 50–60% increase in ability at the end of each workshop, regardless of starting point, with year-on-year improvement in the value added by workshops.

Year 3 Progress: Overall increase of 57%. By gender: 48% average increase for male participants, 84% for female participants. Met target.
Target (January 2014): Increase in the score awarded to written policy briefs by experts blinded to whether each brief was written pre- or post-training.

Year 3 Progress: The quality of participants’ policy briefs increased, on average, from a score of 1.8 to 3.0 out of 6 after a GDNet capacity building effort (a 64% increase). If the criterion relating to length is excluded, the average increase is from 1.6 to 2.2 (a 39% increase). Met target.

Disaggregating the results by gender reveals that confidence and ability increase significantly more among women than men.

Output 2 Indicator 2 – Researchers’ confidence and ability to communicate their research – sustainability of capacity building effort

Perhaps unsurprisingly, researcher response rates after 12 months are low, and those who do respond tend to have a positive story to tell. However, the pledge cases presented are illustrative of sustainable change as a result of a capacity building effort.

Target (January 2014): Rich portfolio of examples of researchers’ communications confidence and ability across a range of sectors and regions.
Year 3 Progress: A third set of ‘pledge’ cases was developed in Year 3 from GDNet workshops 10–13, and an additional set of five Year 2 pledges was followed up after 12 months. In total, GDNet has generated 3- and 12-month pledge follow-ups from 15 researchers. Met target.

Several of the pledge cases developed point to researcher interaction with the policy domain as a direct result of GDNet support, and hence can be captured and claimed under Output 3 Indicator 2.

Output 3 Indicator 1 – GDNet ‘user base’ interaction

In Year 3 GDNet enhanced its focus on, and support to, user-to-user interaction. To do this it invested in better understanding and graphically presenting GDNet’s relevant Web 2.0 and social media usage statistics, as well as maintaining the user base interaction log template approach developed in Year 2. All of GDNet’s social media M&E data and metrics are now available online (including Q1 2014) and can be viewed and manipulated live at the following web address: http://public.tableausoftware.com/profile/#!/vizhome/TestVizGDNet/Blog

Target (June 2014): 20% cumulative increase in user base interaction on spaces and platforms facilitated by GDNet.

Year 3 Progress: Indicators of user-to-user interaction show cumulative increases exceeding 20%; comments on blog postings increased by almost 50% between Year 2 and Year 3. Overall GDNet-to-user interaction has also either been maintained or increased: by the end of 2013, the number of followers on Twitter had reached 1,941 (an increase of 541 over the year, or 38.6%). Met target.

Target (June 2014): Number of debates convened.

Year 3 Progress: Throughout Year 3, the log template shows that GDNet facilitated and recorded 22 discrete user base interaction ‘events’ of varying types, scales and formats – conferences, workshops, meetings, knowledge product launches, and online courses.
This has resulted in many hundreds of user-to-user ‘debates’. Met target.

Output 3 Indicator 2 – Researchers’ interactions with the policy domain

GDNet endeavours to support and facilitate a small number of interactions between Southern researchers and the policy domain. The log designed to capture the nature of this interaction shows that GDNet convened and facilitated two separate policy panels at GDNet Research Communications Capacity Building Workshops in Year 3.

Target (June 2014): At least one research–policy debate per year in one region plus one online space.

Year 3 Progress: GDNet convened and facilitated two separate policy panels at GDNet Research Communications Capacity Building Workshops in Year 3. In addition, several of the Output 2 Indicator 2 three- and 12-month ‘pledge’ follow-up statements indicate that GDNet has been the catalyst for significant researcher interactions with the policy domain. Target exceeded.

Comments made in the previous DFID Annual Review in 2013 correctly identified that GDNet may be under-claiming its achievement under this output by not recognising researcher interaction with policy makers catalysed by GDNet through Output 2 training and capacity building support. This is clearly the case in Year 3 too, as the pledge follow-up descriptions presented in this report illustrate.
Output 4 Indicator 1 – Generation of best practice lessons

Output 4 focuses on the expertise and experience generated by the GDNet team as experts in facilitating, convening and knowledge brokering in the Global South. GDNet has established a process to routinely log and reflect on knowledge brokering best practice in order to generate lessons.

Target (June 2014): Four GDNet best practice products.

Year 3 Progress: Reflective interviews and discussions have been used to generate source material. Content for four short learning publications has been drafted, and an outline has been produced for the Legacy Publication. On track to meet target.

Most of the Legacy Publication content depends on the four short publications and the M&E Report being produced. A peer response mechanism is being discussed with partners, e.g. an e-discussion on the Knowledge Brokers Forum with GDNet’s publications as stimulus material.

Output 4 Indicator 2 – Communication of lessons

GDNet is responsible for communicating the best practice lessons generated through the reflective activities, in order to share the knowledge and expertise it has developed with a wider audience. A record of the communication of the best practice lessons is held in a log.

Target (June 2014): Dissemination plan for GDNet best practice products and other learning publications, for January 2014 to June 2014.

Year 3 Progress: A dissemination plan has been produced, pending confirmation of the peer response mechanism described above. Several examples have been identified of communication and uptake of existing GDNet lessons since the Year 2 M&E Report, including citations and sharing at workshops. On track to meet target.

Due to the programme’s closure, the focus of the dissemination plan has shifted to ensuring publications are sent directly to intermediaries and remain available online beyond June 2014.
Introduction

This document provides the annual progress report (Year 3) and update to GDNet’s Baseline and M&E Framework. The document is structured according to the GDNet logframe, with separate chapters from the Outcome level down through Outputs 1 to 4. Each chapter is structured as follows:

• Year 3 summary – A clear summary statement of progress for each output indicator, for comparison against the baseline, Year 1, Year 2, and the relevant milestone. The statement is followed by a more detailed elaboration of the Year 3 M&E data generated and an analysis of its implications.
• M&E approach summary – A very brief explanation of the approach and method adopted to generate the data for each output indicator. Readers should refer to the 2011 GDNet Baseline and M&E Framework for a more detailed account of how the M&E framework was designed and the methods adopted.
• Data management plan – Setting out the ongoing M&E roles and responsibilities within the GDNet team.
• Evidence base – Providing detailed summaries of the relevant data used to support each output indicator – typically web statistics, the web user survey, log templates, and interviews.

Unless otherwise stated, Year 3 refers to the period January to December 2013. The GDNet M&E baseline was established in December 2010. GDNet’s M&E is reviewed and reported on an annual cycle according to the calendar year, as follows:

Logframe           | M&E Framework
Baseline           | Baseline – established December 2010
Milestone 1 (2011) | Year 1 – January to December 2011
Milestone 2 (2012) | Year 2 – January to December 2012
Target (2014)      | Year 3 – January to December 2013, with data and analysis provided up to April 2014 where available and feasible
Outcome Level – Diverse research and policy audiences make better use of development research from the global south

Progress against logframe indicators

Indicator 1 – Southern researchers’ use of other Southern research in their own research

Target (January 2014): At least 60% of Southern researchers consulted by the programme use Southern research in their own research to a moderate or great extent.

Progress: 69% of respondents use Southern research to a great or moderate extent, an increase of 5 percentage points from Year 2. Exceeded target by 9 percentage points.

Notes: The Year 3 web survey provided a number of interesting insights into the nature and perception of Southern research among Southern users, and how they tend to combine Northern and Southern research in a complementary manner. Analysis of the survey data suggests that women may be less likely than men to distinguish between Northern and Southern research, but the difference is not large enough to be significant given the smaller sample of responses from women.

Comments on M&E approach: The GDNet user base annual web survey is a key component of the M&E approach and has been conducted annually since the baseline. The response rate is not particularly low for web surveys of this type, which attempt to engage a diffuse and generally passive user group. Nonetheless, the low rate may reflect the general proliferation of online surveys (‘survey fatigue’); in Year 3 it cannot have been helped by the user base having been informed that the programme closes in June 2014, and hence having little incentive to contribute.

Indicator 2 – Cases of knowledge into use in policy processes

Target (January 2014): A portfolio of 20 up-to-date cases of knowledge into use in the policy process (5 per year).

Progress: 21 cases developed, validated and updated throughout the life of the programme. Met target.
Notes: As well as providing GDNet with an interesting insight into the research-to-policy processes being pursued by its user base, the cases also offer a deep understanding of the nature of Southern research and how it can and does inform policy and practice.

Comments on M&E approach: No new cases were developed in Year 3, but existing cases were updated. Reflecting on the case development process over three years shows that GDNet has engaged a wealth of innovative, informed and highly motivated researchers. The cases highlight what makes Southern research distinctive – its practicality and innovativeness – something that more traditional Western research may do well to learn from.
Indicator 1 – Southern researchers’ use of other Southern research in their own research

GDNet user base web survey results – Using the same format as the baseline, Year 1 and Year 2 surveys, a number of questions in the web survey indicate the level of use of Southern research. Further details and analysis of the Year 3 web survey are provided in Annex 4.

Asked to what extent they use Southern research in their own work, 69% of respondents said they use it to a great or moderate extent (see Annex 3, question 26). This represents a 5-percentage-point increase on the Year 2, Year 1 and baseline figures, which were all approximately 64%.

When asked to describe the type of research they read, the most common response (34%) was that respondents do not distinguish between Northern and Southern research (see Annex 3, question 27). The next biggest group (28%) believe they read more Northern than Southern research, followed by 25% who believe they read the same amount of each. These results are very similar to the baseline, Year 1 and Year 2 results.

GDNet Web Survey – Year 3

Introduction – The GDNet user base annual web survey is a key component of the M&E approach and provides a range of data to support reporting against the GDNet outcome, Output 1 and Output 2 indicators. This is the fourth time the survey has been conducted since the baseline survey in 2010. A detailed analysis of the Year 3 results is presented in Annex 4.

Response rates – 10,238 GDNet members were invited to participate in the Year 3 survey. Of this number, 271 (2.6%) bounced back, which can indicate a full mailbox or an out-of-date contact. This year the percentage of bounce-backs was significantly reduced (from 4.5% in Year 2).
A further 45 recipients (0.4%) opted out, either using the link provided in the survey or having done so in a previous Survey Monkey-powered survey. After removing these, 9,922 GDNet members received the Year 3 survey, of whom 562 completed it, giving an overall response rate of 5.7%. Within this number, 84 did not complete the full questionnaire. Whilst the data is statistically robust in terms of absolute numbers of respondents, the response rate relative to total GDNet members is relatively small (5.4%). In Year 2, 13,292 GDNet members received the survey and 721 responded, an overall response rate of 6.5%. Response rates were 8.2% in Year 1 and 7.6% in the baseline year (2010).

The response rate is not particularly low for web surveys of this type, which attempt to engage a diffuse and generally passive user group. Nonetheless, the low rate may reflect the general proliferation of online surveys (‘survey fatigue’); in Year 3 it cannot have been helped by the user base having been informed that the programme closes in June 2014, and hence having little incentive to contribute.

Disaggregation by gender – There is no significant difference between men and women in their use of Southern research, but it is possible that women are less likely than men to distinguish between Northern and Southern research: 31% of men, compared to 41% of women. None of the female respondents said that they read only Southern or only Northern research, compared with 10 men (only Southern) and 5 men (only Northern). Given the smaller sample of responses from women, the difference between results would need to be at least 12.8% to be significant.

Reviewing the data generated in the three years since the baseline was established, it is clear that GDNet has exceeded its outcome-level Indicator 1 milestones throughout the course of the programme.
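The response-rate arithmetic in the Response Rates section above can be reproduced directly. A minimal sketch, using only the figures quoted in this report:

```python
# Year 3 survey funnel, using the figures reported above
invited = 10238   # GDNet members invited to participate
bounced = 271     # bounce-backs (full mailbox or out-of-date contact)
opted_out = 45    # recipients who opted out

received = invited - bounced - opted_out  # members who received the survey
completed = 562                           # responses received (84 of them partial)

response_rate = completed / received * 100  # relative to those who received it

print(received)                 # 9922
print(round(response_rate, 1))  # 5.7
```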
Perhaps more interestingly, what emerges is a nuanced picture of use: significant use of Southern research by Southern researchers, but perhaps no more significant than their use of Northern research. Noting the 5-percentage-point increase in the use of Southern research in Year 3, there appears to be a slight increase in the use of Southern research over time, likely related to the slowly evolving nature and perception of Southern research. Similar to previous years, the Year 3 web survey provided a number of interesting insights into the nature and perceptions of Southern research by Southern users and how they tend to combine use of Northern and
Southern research in a complementary manner (see Annex 3, question 28). The following responses illustrate this:

“Southern Research is more relevant to me than Northern Research. I read Northern Research when it has been done in the context of Southern Research. I am interesting in what others in my situation are doing to address problems in their countries.”

“Research is either good or bad. There can be no discrimination between North and South.... I have seen examples of abysmal research from the north and brilliant work from the south and vice versa.”

“I believe that research are of two types: Quite useful research and Not so useful research. Research meant for/aimed at public good are useful; especially for countries of global south. So I personally read/refer those research those are meant for/aimed at public good, to enhance the wellbeing of people. In this backdrop, I do not distinguish whether it is Northern Research or Southern Research. HOWEVER, OF COURSE, EFFECTIVE SOUTHERN RESEARCH ARE OFTEN SEEM TO BE MORE ACCEPTABLE AS CASE STUDIES!!”

“I try to read the most important international journals in my field and they hold mainly Northern research. If I want to read the Southern ones, I have to try to look for them either in other journals or books... And so it goes...”

“I usually look for good research design. There are some problems of substantive differentiation between the most and the less developed contexts, but design and modelling is not very affected by them.”
A theme that has emerged throughout GDNet’s M&E lifecycle is the potentially important niche that GDNet has helped to fill: raising the profile of the best Southern research so that it is perceived as on a par with Northern research in terms of quality, while also highlighting the feature of Southern research that distinguishes it from more traditional Northern or Western research – its applied and practical nature, grounded in local contexts and addressing issues where there is strong demand or a clear evidence gap.

It is clear from the responses provided by the GDNet user base that they are passionate about the quality, value and utility of research generated in the Global South, and that they believe platforms such as GDNet provide an essential function in raising its accessibility and overall profile in a global research system where Southern research struggles to compete with better-funded Northern research.

M&E approach summary
Outcome-level Indicator 1 draws on perceptions of the use of Southern research gathered from the GDNet user base web survey, conducted annually.

Data management plan
Robbie Gregorowski / ITAD
• On an annual basis – Repeat analysis of the annual GDNet user base web survey.
Sherine Ghoneim / GDNet
• Ongoing – Interpretation of the findings of the annual GDNet user base web survey and application to better understand and improve the services GDNet offers.

Evidence base
See Annex 2 for the GDNet user base web survey questionnaire. See Annex 3 for a summary of the results of the Year 3 GDNet user base web survey.
Indicator 2 – Cases of knowledge into use in policy processes

Cases of knowledge into use in the policy process – follow-up status
(Y = phone interview conducted; SENT = follow-up request sent)

Baseline cases (columns: Baseline / 2012 / 2013 / 2014):
Wassam Mina | Investment in Gulf Cooperation Countries | UAE | M | Y | Y | Y | SENT
Rajarshi Majumder | Obstacles for Out of School Children | India | M | Y | SENT | SENT | SENT
Sarah Ayeri Ogalleh | Community tree planting | Kenya | F | Y | SENT | SENT | SENT
Tohnain Nobert Lengha | Obstacles to Disabled Children in Education | Cameroon | M | Y | SENT | SENT | SENT
David Rojas Elbirt | Provision of ‘Watsan’ products | Bolivia | M | Y | Y | SENT | SENT
Pamela Thomas | Decline of immunisation and maternal child health care service delivery | Vanuatu | F | Y | SENT | SENT | SENT
Marcio da Costa | Unequal educational opportunities | Brazil | M | Y | Y | Y | SENT
Gohar Jerbashian | Prevention of maternal and neo-natal mortality | Armenia | M | Y | Y | SENT | SENT

Year 1 cases (columns: 2012 / 2013 / 2014):
Brigitte Nyambo | Integrated Pest Management Technology | Ethiopia | F | Y | SENT | SENT
Cecil Agutu | Laws and policy in sugar sub-sector | Kenya | M | Y | SENT | SENT
Constancio Nguja | Civil Society Advocacy | Mozambique | M | Y | Y | SENT
Davidson Omole | Nigerian Stock Exchange | Nigeria | M | Y | SENT | SENT
Francesco Pastore | Mongolian Youth education and employment | Mongolia | M | Y | SENT | SENT
Hasina Kharbhih | Child Labour Rights | India | F | Y | Y | SENT
Martin Oteng-Ababio | Digital Waste Management | Ghana | M | Y | SENT | SENT
Waweru Mwangi | Card-less ATM system | Kenya | M | Y | SENT | SENT

Year 2 cases (columns: 2013 / 2014):
Dominique Babini | Digital Open Access | Argentina | F | SENT | Y
Gladys Kalema-Zikusoka | Integrated biodiversity, health and community development | Uganda | F | SENT | SENT
Harilal Madhavkan | Traditional medicine industry | India | M | SENT | Y
Nikica Mojsoska Blazevski | Gender Wage Equality in Macedonia | Macedonia | F | SENT | SENT
Yugraj Singh Yadava | Low cost insurance for fishermen in Bangladesh | India | M | SENT | SENT

Throughout the life of the programme, GDNet has developed 21 cases of knowledge into use in policy processes: eight cases at the baseline, eight in Year 1, and five in Year 2.
As well as providing GDNet with an interesting insight into the research-to-policy processes being pursued by its user base, the cases also offer a deep understanding of the nature of Southern research and how it can and does inform policy and practice. A deeper reflection on the new knowledge and learning that GDNet has generated from the case development process over three years forms a central strand of GDNet’s legacy strategy, particularly the GDNet Legacy Document, which is currently in draft and referenced under Output 4 Indicator 2 – Communication of lessons. However, a brief synthesis of the 21 cases uncovers some interesting themes and initial conclusions about the nature of Southern research and its role in informing policy and practice:

Policy influencing factors

• An emerging theme apparent in many of the cases across all three rounds is the extent to which Southern researchers set out to use research to solve distinct development challenges in a practical and pragmatic manner. Several of the themes that became the subject of the research in the cases identified had very little prior robust, empirical research or evidence base. In Year 2, both the Uganda community health workers case, where people lack basic access to effective contraception, and the Bangladesh small-scale fisheries case, where fishermen and their families lack affordable insurance, illustrate how action-research has provided workable solutions to very ‘tangible’ problems.
• Drawing on Southern research addressing pressing developmental problems, it is apparent that nearly all cases are clearly ‘demand-led’. That is, they all respond to the direct demand of the primary stakeholders for research to address a problem or constraint they face. These primary stakeholders, rather than simply being the subjects of the research, are engaged in a very participatory manner as
stakeholder partners in the research process itself. Many of the cases illustrate that the researchers go one step further, engaging policy makers as well as the primary stakeholders from the outset. In this way policy makers are drawn into the research process as it develops.

• Involving decision makers from the outset is just one way in which Southern research tends to take a more innovative, informal and opportunistic approach to research dissemination. Southern researchers conducting 'action research' seem at ease engaging a wide range of stakeholders – local communities, politicians, civil servants, and the media (amongst others) – throughout the research process. This is in direct contrast to Western research, which tends to engage decision makers at the end (if at all), often disseminating research findings through relatively 'formal' and established dissemination and communication processes – presenting at conferences and publishing through formal journal peer-review processes.

• Similarly, the cases continue to highlight that Southern researchers use a wider and more innovative variety of tools to generate 'evidence' to support their research. Several cases highlight the use of documentary evidence (photos and video footage) combined with more traditional research methods, such as key informant interviews, to communicate the research to a wider audience of stakeholders. In this way Southern researchers explicitly draw in the media, civil society organisations, NGOs, and the private sector to put 'pressure' on decision makers to legislate for change. Put simply, the cases highlight that some Southern researchers are particularly adept at translating their research findings into formats appropriate to the needs of multiple stakeholder and audience groups, and at employing a wide range of formats, platforms and channels to reach broad sets of stakeholders.
GDNet's role and contribution

• GDNet has played a simple but critical role in sharing innovative research, connecting researchers in one region or country with researchers elsewhere so that knowledge and learning in one context can effectively be transferred and replicated in similar contexts.

• The use of evidence in the most appropriate format – photos and videos combined with more formal research techniques such as surveys and interviews – is one of the areas where Southern research can be considered more effective and advanced than more traditional Western research: Southern research appears better at bringing in innovative technology, such as visual and social media, to generate more substantial impact. There is a potential role for a successor to GDNet in sharing lessons and experience on how best to combine the two forms of research, as well as in providing training in more innovative research and documentation techniques for Southern researchers – building their capacity to present their research in the most appropriate format for a particular stakeholder audience.

• Reflecting on the case development process over three years illustrates that GDNet has engaged a wealth of innovative, informed and highly motivated researchers. The simple process of producing the cases has provided a showcase for a number of these researchers and their policy-influencing skills. In their legacy document and communications, GDNet should use the experience of facilitating this process to express their learning on the key success factors in producing effective, policy-influencing Southern research. They should highlight what they have learned about what makes Southern research unique – its practicality and innovativeness – something that more traditional Western research may do well to learn from.
Output 1 - Southern research better informed by current ideas and knowledge

Progress against logframe indicators

Indicator 1 - Level of use of and satisfaction with GDNet research-orientated online services

Target (January 2014): 20% cumulative increase in use
Progress: GDNet website received an average of 40,103 visitors per month in 2013, equivalent to just over a 16% year-on-year increase in use from 2011 to 2013, or a 32% cumulative increase in use. Exceeded Target.

Target (January 2014): Maintain high level of satisfaction
Progress: A high proportion of members continue to report satisfaction with each of GDNet's individual services, and in most cases satisfaction has increased, e.g. the Knowledgebase of online papers was rated extremely or moderately useful by 69.4% of respondents and the provision of online journals by 71% of respondents (an increase of eight percentage points on Year 2 in both cases). Met Target.

Notes: There is a significant difference between men and women in how often they visit the GDNet website: 56% of men visit at least once a month, compared to only 38% of women. There do not seem to be any major differences in levels of satisfaction with GDNet services between male and female users, with the exception of the GDNet YouTube channel. Of those who expressed an opinion, only 8.4% of men found it not at all useful, compared to 37.3% of women.

Comments on M&E approach: The M&E of the level of use of, and satisfaction with, research-orientated online services combines GDNet's monthly web statistics with data generated from the annual GDNet user base web survey.

Indicator 2 - Level of use of and satisfaction with themed services

Target (January 2014): 10% cumulative increase in use
Progress: 39.1% of survey respondents report using Thematic Windows occasionally or frequently, an increase of 3.2 percentage points from the 2012 baseline.
Average monthly hits for Thematic Windows pages were 10,491 in 2013 compared to 8,132 in 2012, an increase of 29%. Met Target.

Target (January 2014): Maintain high level of satisfaction
Progress: Thematic Windows are described as 'extremely' or 'moderately' useful by 42.8% of respondents, no change from 2012. Met Target.

Notes: There is no significant difference between men and women in terms of use of, or satisfaction with, GDNet's Thematic Windows.

Comments on M&E approach: The assessment of themed services is based on the triangulation of three sources of data: how GDNet members report their frequency of use and satisfaction (as captured in the survey), the number of hits to the Thematic Windows pages, and the number of Visits recorded to the Thematic Landing Pages (taken from GDNet's webstats).
Indicator 1 - Level of use of and satisfaction with GDNet research-orientated online services

[Chart: Growth in use of GDNet's website – average monthly visitors, 2011–2013, all visitors vs Southern visitors]

Level of use – The level of use of GDNet's research-orientated online services continues to rise. At the headline level, GDNet received an average of 40,103 visitors per month in Year 3. This represents a 16% increase in average monthly visitors over Year 2 and exceeds, by over 4,500 visits per month, the logframe-defined milestone of a 20% cumulative increase in use (35,593 visitors) 1. This is despite the website being redesigned in December 2013, which reduced traffic significantly for that month. However, the percentage of visitors coming from the Global South has fallen from 32% in Year 2 to 24% in Year 3 2. It is unclear why this percentage has fallen, but there are several technical reasons why it is dangerous to draw conclusions from the number of recorded Visitors:

• Inaccurate data: generating location statistics based on IP address (which is how visitors' location is determined) is notoriously unreliable.

• The influence of India and China: in January 2013, the visitors recorded from India represented about a quarter of all visitors from the South that month, with China the second highest source of visitors. 3

• Shared IP addresses: Visitors are identified by the IP address of their internet connection. However, multiple devices can share the same IP address; for example, a whole university might share a single IP address. In the South, there tends to be greater use of shared computers and internet cafes, which would increase this effect on webstats.
• Access from different types of devices: a single user could access the GDNet portal from their tablet, work computer, home laptop and smartphone; in each case a different IP address would be used, and the user would therefore be counted as four visitors.

1 The 20% cumulative increase target for Year 3 was calculated in the same manner as compound interest, i.e. by adding 10% to the projected Year 2 monthly average, which was itself assumed to be 10% higher than that recorded for Year 1.
2 Established from users' IP addresses.
3 India and China are the two largest sources of GDNet's Southern traffic; e.g. in January 2013 about 25% of Southern visitors came from India alone. Any disruption that reduces visitors from India or China (including, perhaps, internet censorship) will therefore have a large impact on the percentage of visitors recorded from the Global South.
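The compounding described in footnote 1 can be sketched as a quick arithmetic check. Note this is illustrative only: the Year 1 monthly average of about 29,416 visitors is an assumption back-calculated from the stated 35,593-visitor milestone, not a figure reported directly.

```python
# Illustrative check of the Year 3 milestone described in footnote 1.
# ASSUMPTION: Year 1 monthly average of ~29,416 visitors, inferred by
# dividing the stated milestone by 1.1 twice; not taken from the report.
year1_avg = 29_416                         # assumed visitors/month in Year 1
year2_projected = year1_avg * 1.10         # Year 2 projected at +10%
year3_target = year2_projected * 1.10      # Year 3 target, a further +10%
year3_actual = 40_103                      # reported Year 3 monthly average

print(round(year3_target))                 # 35593 - the logframe milestone
print(round(year3_actual - year3_target))  # 4510 - the "over 4,500" margin
```

Compounding 10% twice gives a 21% cumulative increase, which is why the report describes the 20% target as being "calculated in the same manner as compound interest".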
To provide a counterpoint to this, some analysis has been done of the number of Visits made to the GDNet website. Although looking at Visits does not address the problem of inaccuracy over geographical location, it does respond to the issues of shared IP addresses and multiple devices. The percentage of Visits coming from the Global South is 41% for Year 3, and in some months over half of the visits made to the site came from the South.

[Chart: Comparing the North/South ratio for Visitors and Visits in 2013 – Southern visitors 24% vs Northern 76%; Southern visits 41% vs Northern 59%]

[Chart: Visits to GDNet each month in 2013, from the South and from the North]
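The difference between IP-based Visitor counts and session-based Visit counts can be illustrated with a toy example. The data below is entirely hypothetical (invented IP addresses and session IDs, not GDNet's logs): when several Southern users sit behind one shared IP address, counting unique IPs understates the Southern share, while counting sessions does not.

```python
# Hypothetical traffic log: (ip_address, session_id, region).
# Three users at a Southern university share one campus IP address;
# two Northern users each connect from their own IP address.
log = [
    ("196.200.1.1", "s1", "South"),
    ("196.200.1.1", "s2", "South"),
    ("196.200.1.1", "s3", "South"),
    ("81.100.2.2",  "s4", "North"),
    ("81.100.3.3",  "s5", "North"),
]

southern = [row for row in log if row[2] == "South"]
# "Visitors" = unique IP addresses; "Visits" = sessions.
visitor_share = len({ip for ip, _, _ in southern}) / len({ip for ip, _, _ in log})
visit_share = len(southern) / len(log)

print(f"Southern share by Visitors (unique IPs): {visitor_share:.0%}")  # 33%
print(f"Southern share by Visits (sessions):     {visit_share:.0%}")    # 60%
```

The same usage produces a much lower Southern share when measured by unique IPs, which is the pattern the report observes (24% of Visitors vs 41% of Visits from the South).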
A further explanation of the drop may relate to GDNet's increased social media efforts throughout 2013, which may have engaged a higher proportion of Northern / Western users who are more familiar with, and have better access to, social media channels.

Considerable progress has taken place in terms of the average monthly document downloads from the GDNet KnowledgeBase (KB). Average monthly document downloads were approximately 4,000 at the baseline, 11,900 during Year 1 and 12,275 during Year 2, and increased to 18,504 during Year 3. This is a pleasing statistic as it represents an increase in the 'quality of use' of GDNet's online services. As referenced in the Year 1 report, quality of use (developing a core of 'involved' users and focussing on their uptake of knowledge) has been a focus of the GDNet team throughout Year 2 and Year 3. With this in mind, the GDNet team endeavoured to develop a small set of 'quality of use' indicators, which were followed up in Year 3. These include total research paper abstract views, which in Year 3 totalled 709,378 (compared to 333,162 in Year 2) and averaged 59,115 per month (compared to 27,764 per month in Year 2). GDNet should be commended for the progress made here – more than doubling the number of abstract views is an excellent indicator of use and illustrates the extent to which the GDNet team has informed the 'behaviour' of its user base, providing an essential service which encourages users to 'actively' engage with the site.

[Chart: Increases in use of GDNet online services – change from the 2012 monthly average in abstract views, document downloads and Thematic Windows hits]

Taken at face value these webstats results are pleasing, as they indicate that there is high absolute usage of the GDNet research paper abstracts (a key GDNet value-added service) and that almost half of all GDNet visitors use the Knowledgebase – a quality of use indicator.
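The year-on-year changes discussed in this section can be reproduced from the monthly averages given above. This is a quick arithmetic check, not an official calculation from the report:

```python
# Year-on-year change in monthly averages, relative to the earlier year.
def pct_change(new, old):
    return round(100 * (new - old) / old)

print(pct_change(18_504, 12_275))  # document downloads, 2012 -> 2013: 51
print(pct_change(59_115, 27_764))  # abstract views: 113 (more than doubled)
print(pct_change(10_491, 8_132))   # Thematic Windows hits: 29
```

Note that computing the change relative to the earlier year, as above, gives 29% for Thematic Windows hits; dividing the difference by the later year's figure instead would give about 22%.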
Several of the comments provided in the web survey appear to support the conclusion that GDNet's user base has grown to depend on the services GDNet offers (quotes reproduced verbatim):

"I always resort to GDNet when I source for materials to read to prepare a proposal or write up a paper or article. Indeed, oftentimes GDN is the first website that I approach with confidence for such materials."

"It is only through the GDN website that I can access the Jstor and the Muse collection. I recall in January 2014, I had to give a talk to legislators from Nigeria who were in Lincoln University (UK) for a conference. Some important articles I wanted to consult for the paper were in Jstor. But because GDNet was shut down, I had a hard time accessing the articles. It was a few days to the event that I got alternative access. The stress was high."

"This is the only channel that Southern Researchers can feel that they are being assist in increasing their capacity and capability in undertaking and sharing their research. It is a portal that we can call it home."
"Very important. The Southern Researchers need a platform to share their research and connect with other researchers. This platform provided that kind of a platform."

"GDNet has played a pivotal role in connecting the researchers to the literature, opportunities and funds. It must continue to serve the world of research."

"This portal is an important initiative sharing and encouraging researchers to conduct and use these materials for their own work. This is specifically important as people from South get ideas and challenges from their own context."

"I felt like losing access to a resourceful library. Certainly, it is going to cause a big handicap to the researchers, students in the developing countries. GDNet must reconsider its plan."

The survey and webstats appear to give contradictory messages about the demand for access to free journals. As highlighted above, some Southern researchers appear to be dependent on GDNet as a source of peer-reviewed research, and the access GDNet provides (together with news of funding opportunities) is considered the most useful service GDNet offers (see table below). Furthermore, Southern respondents reported that lack of access to journals and data is the second biggest challenge Southern researchers face, and nearly three-quarters of them said that it was very important that free access to e-journals should continue (see Annex 3). However, the volume of actual use via the GDNet website is declining. In 2012, researchers made use of the online journals 84 times each month on average; in 2013, this monthly average dropped to 69. A further 237 visits were made each month, on average, to GDNet's Free Online Journals gateway page. Two contributing factors to the 2012 levels of use not being maintained are that GDNet's resources were directed towards planning for closure in 2013 and away from promoting the online services, and that the online services were unavailable during December while the website was re-launched.
The number of recipients of GDN newsletters continues to rise, and the rate has picked up on Year 2: an average of 22 new recipients per month receive the Research into Focus newsletter (up from an average of 15 in Year 2) and an average of 23 new recipients receive the Funding Opportunities newsletter (up from an average of 17 in Year 2). Although this is still not as high a rate as in Year 1, it is not deemed too significant, as GDNet's strategy focussed on quality involved usage is not based on newsletter recipients, who, to a certain extent, represent a less involved means of interaction with users. The Gender Audit carried out in 2012, however, highlighted a gender difference in interest in email newsletters, with GDNet's female users (typically 25% of GDNet's membership) finding them useful while male users prefer social media and accessing the website directly.

The GDNet team were quick to embrace technological progress in user engagement in Year 2 and have continued this in Year 3. This is illustrated by the fact that GDNet now maintains several complementary platforms alongside the website – a blog, Twitter feed, YouTube channel and LinkedIn page 4 – to support interactive user engagement through cross-posting. The implications of maintaining multiple platforms are discussed in the next section, drawing on the web survey results assessing levels of satisfaction.

Disaggregation of responses on use, by gender

There is a significant difference between men and women in how often they visit the GDNet website: 56% of men visit at least once a month, compared to only 38% of women. Perhaps as a consequence, men tend to use research found on GDNet's website more often than women: 52.2% of male respondents report using it in their work at least once every six months, compared to 34.1% of women.
Thematic Windows were introduced to reduce time spent searching the site (lack of time being a potential barrier to use identified in GDNet's Gender Audit in Year 2) and uptake among women is good, with 44% using them frequently or occasionally. The indications are that men are more likely to use GDNet's social media

4 GDNet blog - http://gdnetblog.org/
GDNet Twitter – https://twitter.com/Connect2GDNet
GDNet YouTube – http://www.youtube.com/user/gdnetcairo
GDNet LinkedIn - http://www.linkedin.com/company/gdnet
tools (Twitter and YouTube) than women, but there is not a big enough sample size from female members to be certain.

Level of satisfaction – Satisfaction with GDNet's research-orientated online services is assessed on the basis of the web survey findings, in particular question 13, which asks GDNet users to rate GDNet services according to their usefulness. A summary of the Year 3 results, with the Year 2 results in brackets, is provided below:

| Answer Options | Extremely Useful | Moderately Useful | Somewhat Useful | Not at all Useful | Lack Access to Service | Not aware of service |
|---|---|---|---|---|---|---|
| Funding Opportunities newsletter | 42.62% (36.9%) | 28.48% (30.2%) | 15.40% (20.7%) | 4.43% (3.9%) | 1.90% (1.5%) | 7.17% (7.8%) |
| GDNet newsletters | 32.20% (27.8%) | 41.15% (40.4%) | 18.98% (21.8%) | 1.92% (2.8%) | 0.85% (1.1%) | 4.90% (6.1%) |
| GDNet Knowledgebase - Online papers | 32.39% (27.8%) | 36.96% (33.6%) | 18.26% (22.9%) | 3.91% (3.3%) | 1.30% (2.7%) | 7.17% (9.7%) |
| GDNet Knowledgebase - Researchers' profiles | 16.89% (15.1%) | 34.00% (29.2%) | 32.89% (34.3%) | 6.22% (6.7%) | 1.78% (3.4%) | 8.22% (11.3%) |
| GDNet Knowledgebase - Organisations' profiles | 16.22% (13.3%) | 31.31% (27.6%) | 34.01% (36.5%) | 6.53% (7.2%) | 2.70% (2.5%) | 9.23% (12.9%) |
| Online journals | 41.56% (35.4%) | 29.44% (27.6%) | 14.29% (16.9%) | 2.16% (4.6%) | 2.38% (3.8%) | 10.17% (11.7%) |
| Regional window portals | 15.40% (15.1%) | 29.91% (30.2%) | 29.24% (28.0%) | 7.14% (6.1%) | 1.79% (3.1%) | 16.52% (17.5%) |
| Thematic Windows | 15.92% (14.9%) | 26.91% (27.8%) | 28.48% (26.3%) | 8.30% (7.8%) | 2.47% (3.0%) | 17.94% (20.2%) |
| GDNet Feeds (RSS or email) | 8.64% (10.5%) | 23.86% (22.2%) | 30.45% (28.5%) | 12.50% (11.4%) | 4.09% (4.7%) | 20.45% (22.7%) |
| GDNet YouTube channel | 3.66% (5.0%) | 14.87% (14.7%) | 26.32% (25.3%) | 18.54% (16.5%) | 5.26% (6.8%) | 31.35% (31.7%) |
| GDNet Twitter | 3.96% (4.8%) | 12.12% (13.0%) | 27.74% (23.5%) | 19.35% (18.9%) | 6.29% (7.4%) | 30.54% (32.3%) |
| GDNet Blog | 6.05% | 15.12% | 30.00% | 15.81% | 5.12% | 27.91% |

(No Year 2 figures are shown for the GDNet Blog.)

GDNet is aiming to maintain a high level of satisfaction, as defined by the Output 1 indicator 1 milestone (January 2014).
The web survey results, particularly those focussed on quality involved usage, suggest that this has been achieved, if not exceeded. For example, the Knowledgebase online papers were rated extremely useful by 32.39% (an increase of 4.6 percentage points on Year 2) and moderately useful by a further 36.96% (an increase of 3.4 percentage points) of respondents to this year's survey. Access to online journals was rated extremely useful by 41.56% (an increase of 6.2 percentage points) and moderately useful by 29.4% (an increase of 1.8 percentage points).

Question 14 in the web survey asked recipients to assess which of GDNet's core services they felt it was important to continue providing. Perhaps unsurprisingly, given that recipients were informed in the survey introduction that GDNet would be closing in June 2014, nearly all of GDNet's core services were deemed by respondents to be either very or quite important. The results are presented in the table below:

| | Very important | Quite important | Not important |
|---|---|---|---|
| Access to JSTOR/Project Muse online journals (if you have already been eligible for free access) | 73.4% | 19.2% | 7.3% |
| A searchable database of researchers for you to make contact with | 54.9% | 38.1% | 7.0% |
| A public webpage for you to share your contact details, research interests and papers | 52.1% | 36.1% | 11.8% |
| Opportunity to participate in online discussions | 37.3% | 44.9% | 17.7% |
| Information on funding for southern researchers | 66.7% | 26.3% | 6.9% |
| Online toolkits and guides on how to communicate research | 55.2% | 36.0% | 8.8% |

All six core services are well valued by respondents. Perceived as the least important was the opportunity to participate in online discussions (82.2% considered this 'very important' or 'quite important') – an indicator of very engaged usage perhaps not representative of GDNet's 'average' user. However, four out of six core services (access to online journals, the researchers database, funding opportunities, and online toolkits and guides) were considered important by over 90% of respondents.
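As a quick arithmetic check on the claims above, the combined 'very important' plus 'quite important' shares can be computed from the Question 14 figures (percentages as reported; the short service labels are abbreviations introduced here):

```python
# Combined importance shares from the Question 14 table: (very, quite).
services = {
    "JSTOR/Project Muse journals": (73.4, 19.2),
    "Researchers database": (54.9, 38.1),
    "Public researcher webpage": (52.1, 36.1),
    "Online discussions": (37.3, 44.9),
    "Funding information": (66.7, 26.3),
    "Toolkits and guides": (55.2, 36.0),
}
combined = {name: round(v + q, 1) for name, (v, q) in services.items()}
over_90 = [name for name, share in combined.items() if share > 90]

print(combined["Online discussions"])  # 82.2 - the lowest combined share
print(len(over_90))                    # 4 services score above 90%
```

This reproduces both figures quoted in the text: online discussions is the least valued service at a combined 82.2%, and four of the six services exceed 90%.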
These findings may be useful to other
programmes and knowledge services in the future that have a mandate to support Southern researchers with their knowledge needs and with raising their profile.

The Year 3 web survey again provides some interesting insights into GDNet's newer social media / web 2.0 tools and platforms (GDNet's feeds, YouTube channel, Twitter, and Community Groups), which have noticeably lower usefulness ratings. In Year 3 GDNet invested in better understanding the use and value of these tools and platforms. A report drafted by GDNet's social media lead, illustrating GDNet's social media 'portfolio' in Year 3, is presented in Annex 7. Contrasting the web survey results with the social media usage reporting produces some interesting findings:

• GDNet's portfolio of social media tools and platforms is seen as either extremely or moderately useful by approximately a fifth to a quarter of web survey respondents – a relatively low general satisfaction / usefulness rating, broadly in line with Year 2.

• However, the detailed social media monitoring and reporting presented in Annex 7 demonstrates that use of the tools and platforms steadily increased from Year 2 to Year 3, albeit from a relatively low base.

As explained in the Year 2 report, the '90 – 9 – 1 rule' is generally accepted by knowledge brokering experts: 90% of knowledge network or platform members are passive and engage ad hoc and periodically, 9% are passive but visit regularly, and only 1% can be considered engaged users who constructively contribute to the platform or network. It is therefore sensible of GDNet to develop a strategy that aims to engage the 9% and 1% of regular users. What matters is the role GDNet's Web 2.0 tools and services play with this smaller but active 'core' user base, as well as how the tools allow GDNet to engage with different users in different contexts and at different times – at conferences and training events, etc.
The social media report illustrates that during Year 3 GDNet produced 88 blog posts, 97 videos and 1,582 tweets. The blog posts received 14,717 views, broadly in line with the Year 2 viewing figure of 15,916, despite the GDNet team attending significantly fewer events in Year 3 (events tend to be the catalyst for blog viewing).

Disaggregation of responses on satisfaction, by gender

There do not seem to be any major differences in levels of satisfaction with GDNet services between male and female users. There were fewer responses to these questions than to some others in the survey, so the differences observed tend not to be big enough to be considered significant, with the exception of the GDNet YouTube channel: of those who expressed an opinion, 8.4% of men found it not at all useful, compared to 37.3% of women. There does, however, appear to be a gap in awareness for some services, especially Twitter and the GDNet blog; but respondents give inconsistent answers on awareness between the questions on frequency of use and satisfaction, so this would need further investigation 5.

M&E approach

The M&E of the level of use of, and satisfaction with, research-orientated online services combines GDNet's monthly web statistics with data generated from the annual GDNet user base web survey.

Data management plan

Karim Sobh / Dina Abou Saada
• Design, testing and monthly production of a standardised GDNet web statistics report.

Shahira Emara
• Monthly collection and quality assurance of web statistics.

Robbie Gregorowski
• On an annual basis – assess the level of use of research-orientated online services over the previous 12 months through analysis of web statistics and the annual GDNet users web survey, and report findings against the baseline and lessons learnt to GDNet.

5 e.g. in Q13, 26.3% of male respondents are unaware of GDNet's Twitter account compared with 39.8% of female respondents, but in Q12 the figures are 17.1% and 24.8% respectively.
For the GDNet blog, the Q13 figures are 22.7% of men compared to 40.7% of women, and in Q12, 14.8% and 25.9% respectively.
Evidence base

A detailed explanation of the process used to generate the web statistics and the GDNet user base web survey can be found in the Baseline and M&E Framework report.

• Annex 2 provides an outline of the Year 3 web survey questionnaire.
• Annex 4 presents the results and a brief analysis of the Year 3 web survey responses.
Indicator 2 - Level of use of and satisfaction with themed services

In July 2011, GDNet piloted a beta version of 11 themed services and launched the full set of 23 themed services in November that year. Year 2 was therefore the first year for which a baseline could be set for satisfaction and use of the full set of Thematic Windows, and the figures presented below compare the measures for 2012 and 2013.

Use of Thematic Windows

This can be understood on the basis of three sources of data: how GDNet members report their use in terms of frequency (as captured in the survey), the number of hits to the Thematic Windows pages, and how many Visits are recorded to the Thematic Landing Pages (taken from GDNet's webstats).

How often do GDNet Members use the Thematic Windows?

Through the survey, GDNet members report that they are using the Thematic Windows more frequently than in 2012. In 2013, 39.2% of respondents used them Frequently or Occasionally, compared to 35.8% in 2012.

| | Frequently | Occasionally | Rarely | Never | Lack Access to Service | Not aware of service |
|---|---|---|---|---|---|---|
| 2013 | 8.8% | 30.3% | 25.9% | 20.9% | 2.4% | 11.7% |
| 2012 | 7.8% | 28.0% | 28.2% | 20.8% | 1.0% | 14.1% |

The survey responses suggest that some members may not understand what is meant by Thematic Windows: they are free for the public to use via the GDNet website, so no member should be reporting that they lack access to them as a service.

How much use is made of the Thematic Windows?

GDNet's website statistics record the number of hits for the Thematic Windows and show how many times the Thematic Window landing pages are visited 6, which gives a measure of the volume of use each month. Average monthly hits for Thematic Windows pages were 10,491 in 2013 compared to 8,132 in 2012, an increase of 29%. The average monthly visits to the Thematic Windows were 10,483 for 2013, or about 20% of the visits to the GDNet website, but as the chart below shows, this varies greatly from month to month.
6 This is based on visits made to the individual Thematic Window landing pages e.g. http://www.gdnet.org/~themes/Agriculture
[Chart: Monthly Visits to GDNet in 2013 – Thematic Windows vs rest of site]

Some Thematic Windows are more popular than others, although as the chart below illustrates, interest in themes appears to change from year to year 7. In the Year 2 report it was suggested that GDNet might rationalise the number of Thematic Windows from 23 to 10, using webstats as an input to that decision. However, the chart shows the danger of drawing conclusions from a single set of figures. Comparing the traffic generated by the Thematic Windows in 2013 and 2012, it is clear that the Globalization and Trade window is the 2nd most popular window in 2013, but in 2012 it was ranked 15th; had GDNet gone ahead with reducing its Thematic Windows, it could well have been one of those removed. By contrast, Urban Development and the Global South was the 8th most popular in 2012 and might have been retained, but in 2013 it moved down to 14th position as other Thematic Windows increased in use.

7 The analysis of traffic to the individual Thematic Windows is based on the data available – in this case, hits. This is considered a weak metric in comparison to visits, which is a further reason to be cautious about making decisions on website content based on webstats.
[Chart: Number of hits per Thematic Window, 2013 vs 2012, ordered by 2013 traffic: Agriculture; Globalization and Trade; Education and Training; Information & Communications Technology (ICT); Poverty & Inequality; Health; Development Finance & Aid Effectiveness; Environment and Climate Change; Macroeconomics and Economic Growth; Gender; Governance; Energy; Domestic Resource Mobilization; Urban Development and the Global South; Law and Rights; Labor & Social Protections; Migration; Entrepreneurship; International Affairs; Innovation; Private Sector Development; Transport; Water]

How useful do GDNet Members find the Thematic Windows?

The web survey indicates that the number of GDNet members who find the Thematic Windows useful is about the same as in 2012. 43% of GDNet members find the Thematic Windows either extremely or moderately useful (as in 2012), with a further 28.5% finding them somewhat useful (up slightly from 2012). There is further evidence of improved awareness among users: 27% of users were unaware of the Thematic Windows in Year 1; this figure fell to 20% in Year 2 and further still to 17.9% in 2013.

| | Extremely Useful | Moderately Useful | Somewhat Useful | Not at all Useful | Lack Access to Service | Not aware of service |
|---|---|---|---|---|---|---|
| 2013 | 15.9% | 26.9% | 28.5% | 8.3% | 2.4% | 17.9% |
| 2012 | 14.9% | 27.8% | 26.3% | 7.8% | 3.0% | 20.2% |

M&E approach

The level of use of themed services was monitored in Year 3 by triangulating three sources of data: how GDNet members report their use in terms of frequency and satisfaction (as captured in the survey), the number of hits to the Thematic Windows pages, and how many Visits are recorded to the Thematic Landing Pages (taken from GDNet's webstats).

Data management plan

Shahira Emara
• Day-to-day – management and facilitation of themed services, including generating web statistics on the level of use (reported monthly but analysed quarterly).
Robbie Gregorowski
• On an annual basis – assess thematic service satisfaction through the annual GDNet users web survey, as well as designing a short web survey targeted at thematic micro-site users.

Evidence base

• Annex 2 provides an outline of the Year 3 web survey questionnaire.
• Annexes 3 and 4 present the results and a brief analysis of the Year 3 web survey responses.
Output 2 - Researchers better able to communicate their research to policy

Progress against logframe indicators

Indicator 1 – Researchers' confidence and ability to communicate their research – immediately following capacity building effort

Target (January 2014): Consistent 30–40% increase in confidence at the end of each workshop, regardless of starting point. Expect to see year-on-year improvement in value added in workshops.
Progress: Overall increase of 52%. By gender: 43% average increase for male participants, 74% average increase for female participants. Exceeded Target.

Target (January 2014): Consistent 50–60% increase in ability at the end of each workshop, regardless of starting point. Expect to see year-on-year improvement in value added in workshops.
Progress: Overall increase of 57%. By gender: 48% average increase for male participants, 84% average increase for female participants. Met Target.

Target (January 2014): Increase in the score awarded to written policy briefs by experts blinded to whether the brief was written pre- or post-training.
Progress: The quality of participants' policy briefs increased, on average, from a score of 1.8 to 3.0 out of 6 after a GDNet capacity building effort (a 64% increase). If the criterion relating to length is excluded, the average increase is from 1.6 to 2.2 (a 39% increase). Met Target.

Notes: Disaggregating the results by gender reveals that confidence and ability increase significantly more amongst women than men.

Comments on M&E approach: In Year 3 the results of GDNet's capacity building efforts were disaggregated by gender for the first time.
In response to DFID's feedback in the Year 2 Annual Review, a method of assessing ability based on a systematic desk review of policy briefs was also introduced.

Indicator 2 – Researchers' confidence and ability to communicate their research – sustainability of capacity building effort

Target (January 2014): Rich portfolio of examples of researchers' communications confidence and ability across a range of sectors and regions.
Progress: A third set of 'pledge' cases was developed in Year 3 from GDNet workshops 10-13. An additional set of five Year 2 pledges was followed up after 12 months. In total, GDNet has generated 3- and 12-month pledge follow-up from 15 researchers. Met Target.

Notes: Several of the pledge cases developed point to researcher interaction with the policy domain as a direct result of GDNet support, and hence can be captured and claimed under Output 3 indicator 2.

Comments on M&E approach: Perhaps unsurprisingly, researcher response rates after 12 months are low, and those who do respond tend to have a positive story to tell. However, the pledge cases presented are illustrative of sustainable change as a result of a capacity building effort.
Indicator 1 – Researchers' confidence and ability to communicate their research – immediately following capacity building effort

During the Year 3 period GDNet conducted three training events, with successful completion participant numbers as follows:

Workshop 10: GDNet-AERC policy brief training, Tanzania – 21 participants (18 men, 3 women)
Workshop 11: GDN Awards and Medals Finalists, Philippines – 18 participants⁸ (7 men, 11 women)
Workshop 12: GDNet-AERC policy brief training, Kenya – 16 participants (13 men, 3 women)
Total: 55 participants (38 men, 17 women)

A summary of the 'before and after' confidence and ability scores generated across the three research communications capacity building events conducted by GDNet during Year 3 is provided below. The scores have been disaggregated by gender and show a striking difference in mean average increases in confidence and ability between men and women. As the chart illustrates, these differences are observed at both types of workshop: Policy Brief (PB) and Awards & Medals (A&M).

[Chart: Increases in self-assessment scores of confidence and ability differ according to gender – average % increase for PB: Confidence, PB: Ability, A&M: Confidence and A&M: Ability, female vs male]

The average increases in confidence and ability self-assessment scores for women are markedly higher than those of the men. Closer analysis of individual scores reveals that women attending GDNet workshops tend to rate their confidence and ability lower than the male participants do, but their scores at the end of the workshops are as high as, if not higher than, those of the men.
There are many reasons why this difference might exist:
 male participants may be over-confident in their research communication skills going into the workshop, while female participants may have a more accurate awareness of their own confidence and ability
 female researchers may come into the workshop with less confidence in their abilities, which is disproved during the workshop, i.e. they realise they are better than they thought
 GDNet's workshops tend to be led by female facilitators, and this may have a positive influence on the female participants' learning

8 Of these 18, there were only 15 usable self-assessments.
Workshop 10 – GDNet-AERC Policy Brief Workshop, 7-9 June 2013, Arusha, Tanzania

                Mean average before | Mean average after | Mean average increase
Confidence              2.7         |        4.3         |         1.7
  Women only            2.2         |        4.3         |         2.1
  Men only              2.7         |        4.3         |         1.6
Ability                 2.5         |        4.3         |         1.8
  Women only            2.3         |        4.5         |         2.2
  Men only              2.6         |        4.3         |         1.7

Workshop 11 – Awards & Medals Presentation Skills, 17-18 June 2013, Manila, Philippines

                Mean average before | Mean average after | Mean average increase
Confidence              3.5         |        4.5         |         1.0
  Women only            3.3         |        4.5         |         1.2
  Men only              3.7         |        4.4         |         0.7
Ability                 3.2         |        4.4         |         1.2
  Women only            2.8         |        4.4         |         1.6
  Men only              3.5         |        4.4         |         0.9

Workshop 12 – GDNet-AERC Policy Brief Workshop, 6-8 December 2013, Nairobi, Kenya

                Mean average before | Mean average after | Mean average increase
Confidence              2.5         |        4.3         |         1.8
  Women only            2.1         |        4.5         |         2.3
  Men only              2.6         |        4.3         |         1.6
Ability                 2.6         |        4.3         |         1.8
  Women only            2.2         |        4.6         |         2.3
  Men only              2.6         |        4.3         |         1.6

The 2013 results produce average before and after confidence and ability figures as follows:
 Average before confidence score: 2.9
 Average after confidence score: 4.4
 Year 3 average increase in confidence: 1.5 (52%)
 Average before ability score: 2.8
 Average after ability score: 4.3
 Year 3 average increase in ability: 1.6 (57%)

The average increase in confidence (52%) exceeds the Year 3 target of 30-40%, and the average increase in ability (57%) comfortably meets the Year 3 target of 50-60%. However, the increases for the GDNet/AERC Policy Brief workshops greatly exceed the targets, at 69% for both confidence and ability. As was observed in the Year 2 report, the annual A&M workshop has lower average increases in confidence and ability compared to the other workshops (Workshop 11 above).
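The percentage figures reported above are the mean increase expressed relative to the 'before' self-assessment score. A minimal sketch of the arithmetic, using the Year 3 aggregate confidence scores quoted above as illustrative inputs (the source may have computed from unrounded workshop-level scores, so recomputing from rounded figures will not always match exactly, as with the ability figure):

```python
def percent_increase(before: float, after: float) -> float:
    """Average increase expressed as a percentage of the 'before' score."""
    return (after - before) / before * 100

# Year 3 aggregate confidence scores (on a 1-5 self-assessment scale)
print(round(percent_increase(2.9, 4.4)))  # → 52, matching the reported figure
```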
It has been noted previously that this is likely to be due to the nature of the A&M finalists who participate in the workshop: their self-assessments of confidence and ability are comparatively high going into the workshop; they are training in order to compete for a prize at the GDN Conference; and the focus of the training is different to that of the Research to Policy workshops delivered for AERC researchers. By the end of 2013, GDNet had delivered 12 capacity building workshops for which participants' self-assessment scores are available, which presents the opportunity to undertake comparisons between years.
Year 1 did not include an A&M workshop, making its annual average increases higher than they would otherwise be, so to establish whether there is any year-on-year improvement, the A&M workshop results have been excluded from the annual averages for this analysis. As the chart below illustrates, there is a clear trend of improvement from Baseline to December 2013, with marginal increases in Years 1 and 2 on the Baseline and a major increase by Year 3. This "performance improvement curve" is a common phenomenon: limited results initially as a team tries a new approach and refines the model based on learning, followed by a sudden rise.

Average increases for GDNet Research to Policy Workshops, 2010-2013:

             Baseline | Year 1 | Year 2 | Year 3
Confidence     46%    |  48%   |  52%   |  69%
Ability        46%    |  52%   |  48%   |  69%

[Chart: Average increases (%) without A&M workshops – Confidence and Ability, Baseline to Year 3]

In producing the results for Year 3 and comparing them to the results of earlier years, some errors were observed in the mean averages for previous workshops. The self-assessment scores from the first workshop had 10 statements each for Confidence and Ability, but over time the number and nature of the statements were tailored for each workshop. The template for analysing the results of the first workshop was applied to all the workshops without adjusting the formulae to reflect the different number of statements. Where this has happened, the average increases for previous years differ from those originally reported, but this does not affect the overall result, i.e. where GDNet was reported as having met its targets in Years 1 and 2, this is still the case. The formulae have now been corrected to produce the average increases below.
Average before and after confidence and ability figures for all workshops:
 Average before confidence score: 2.9
 Average after confidence score: 4.4
 Year 3 average increase in confidence: 1.5 (52%)
 Year 2 average increase in confidence: 1.2 (43%)
 Year 1 average increase in confidence: 1.4 (48%)
 Baseline average increase in confidence: 1.2 (39%)
 Average before ability score: 2.8
 Average after ability score: 4.3
 Year 3 average increase in ability: 1.6 (57%)
 Year 2 average increase in ability: 1.1 (41%)
 Year 1 average increase in ability: 1.4 (52%)
 Baseline average increase in ability: 1.1 (38%)

Participants' Policy Brief Analysis

In the annual review report for Year 2 of GDNet, DFID recommended strengthening the evaluation of the capacity building training by, for example, testing written material produced before and after training. For Year 3, a method of external review of participants' policy briefs was designed and piloted to complement the established self-assessment process (see Annex 9 for full details of the method, results, analysis and reflections on this M&E approach). Two external consultants with expertise in research communication for policy audiences reviewed the "before" and "revised" policy briefs of 18 participants in the GDNet/AERC 2013 Policy Brief workshops, using a six-point checklist of absolute (Yes/No) statements, supplied by ITAD, based on the advice given to participants by GDNet on the elements of a good policy brief:

1. Length: Is the policy brief a suitable length?
2. Language: Is the policy brief written for a non-specialist audience?
3. Evidence: Are the arguments supported by research evidence?
4. Readability: Is the policy brief written in an easy-to-read format?
5. Policy-oriented: Is the brief targeted at a policy audience?
6. Persuasive: Does the policy brief have a consistent, single argument?

All but one of the criteria are subject to a degree of interpretation, so the reviewers were asked to score the briefs individually, and then discuss and agree the final scores between them. The consultants also provided comments about each pair of policy briefs to explain their scoring and to highlight the degree of change between the versions which may not be reflected in the score, given the use of absolute statements.
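The six-point Yes/No checklist lends itself to a simple binary tally: one point per criterion met, giving a score out of 6 for each version of a brief. A minimal sketch of that scheme (the criterion names come from the checklist above; the sample verdicts are invented for illustration, not taken from the actual reviews):

```python
# Criterion names from the GDNet/ITAD checklist above
CRITERIA = ["Length", "Language", "Evidence", "Readability",
            "Policy-oriented", "Persuasive"]

def score(verdicts: dict) -> int:
    """Total score out of 6: one point per criterion the reviewers judged 'Yes'."""
    return sum(1 for c in CRITERIA if verdicts.get(c, False))

# Invented example: one participant's brief before and after the workshop
before = {"Evidence": True}
after = {"Length": True, "Language": True, "Evidence": True, "Readability": True}
print(score(before), score(after))  # → 1 4
```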
Summary of average scores from Policy Brief review:

Average score before (out of 6): 1.8
Average score after (out of 6): 3.0
Average increase: 1.2 (64%)

The average participant's skill in producing policy briefs increased by 64%; however, this headline is based on a mean average and over-simplifies a more complicated picture. Some participants had much higher overall increases, while some showed no overall increase or even scored lower on their revised policy brief, effectively suggesting that their ability had decreased. The chart below shows the frequency of the positive and negative differences in the scores for the before and revised briefs, across the 18 participants.
[Chart: Frequency of differences between "before" and "revised" policy brief scores, across the 18 participants (differences ranging from -3 to +5)]

12 of the 18 participants improved their overall scores by between 1 and 5 points. However, 4 participants showed no overall improvement and 2 participants showed an overall decrease. In many cases this was because a participant improved their brief on one aspect to the detriment of another. For example, one participant reduced the word count of their "before" policy brief by over a third (excluding references) and also improved its readability, but the reviewers felt this was at the expense of supporting their arguments with relevant evidence and providing policy recommendations. The overall score for this participant therefore stayed the same.

Challenges with writing succinctly

Participants appeared to struggle with the challenge of writing concisely (keeping their policy briefs below 2,000 words) and convincingly for a policy audience. Before the training, 13 out of 18 of the policy briefs were too long, with some close to 4,000 words. After the training, 10 of the participants had reduced the length to between 1,000 and 2,000 words, but several of these participants either failed to increase their scores on other criteria or even scored worse on other criteria as a consequence. This suggests that the art of writing concisely for a policy audience is unfamiliar to many researchers, and is something they will need to practise, and receive further support on, over a longer time period than the current capacity building approach allows. The word count is of course something which researchers can check for themselves, while other aspects of a good policy brief are qualitative and can only be judged through experience and learning from feedback over time. To see the influence that the effort to write to a reduced length has on participant scores, the average scores were recalculated with those for length removed (see below).
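The frequency distribution described above is straightforward to reproduce from the paired scores. A sketch using invented (before, revised) score pairs constructed only to match the reported breakdown (12 improved, 4 unchanged, 2 decreased); the actual pairs are in Annex 9:

```python
from collections import Counter

# Invented (before, revised) scores out of 6 for 18 participants
pairs = [(1, 2), (2, 4), (1, 6), (0, 3), (2, 3), (1, 3),
         (2, 5), (1, 4), (3, 4), (0, 2), (1, 2), (2, 3),  # 12 improved
         (2, 2), (1, 1), (3, 3), (2, 2),                  # 4 unchanged
         (3, 1), (4, 2)]                                   # 2 decreased

# Tally how often each before-to-revised difference occurs
differences = Counter(revised - before for before, revised in pairs)
improved = sum(n for diff, n in differences.items() if diff > 0)
unchanged = differences[0]
decreased = sum(n for diff, n in differences.items() if diff < 0)
print(improved, unchanged, decreased)  # → 12 4 2
```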
There is still an improvement as a result of the GDNet capacity building workshop; however, it is only 39%, compared with 64% when the scores for length are included.

Summary of average scores from Policy Brief review (with scores for Length removed):

Average score before (out of 5): 1.6
Average score after (out of 5): 2.2
Average increase: 0.6 (39%)

Comparison with self-assessment

The policy brief review activity used a different scoring approach to that used by the participants in their self-assessment, so comparisons can only be approximate: the average increase of the participants' scores was
64% and the average increase in ability for the participants of the June and December 2013 workshops through self-assessment was 70%. From closer analysis, it appears that there is sometimes a gap between a participant's own assessment of what they have learnt and their ability to put it into action. Contrary to the results of the policy brief review, all of the participants except one⁹ thought they were more able after the workshop than before to write a policy brief and to identify key messages from their own research that are of interest to other audiences. One might conclude that self-assessment therefore does not reflect the reality of the impact of training on participants. However, it is the opinion of the evaluators that a combination of measures, together with information about the participant and their working environment, is required to establish a true picture.

Observations on the policy brief analysis method

In summary, the consultants concluded that the policy brief analysis method does not produce reliable evidence of ability to write a policy brief. This is because the briefs were assessed in isolation by people who lack background information on the specific context in which the brief might be used, the original research upon which it was based, the length of time spent on each version, the participants' attitudes to how valuable a policy brief would be in their environment, etc. However, in piloting the method, some interesting insights have been obtained, such as the challenge researchers have in writing concisely, and how the method could be adapted to be more effective and less resource-intensive; these are detailed in Annex 9.
Implications for M&E of capacity building

The approach to monitoring and evaluating researchers' confidence and ability to communicate their research draws heavily on Kirkpatrick's training evaluation model, which outlines four levels of outcome:

Level 1 – Reaction: how the participants felt about the training or learning experience.
Level 2 – Learning: measuring the participants' attitudes, knowledge and skills, before and after.
Level 3 – Behaviour: looking at the transfer of the knowledge, skills and attitudes when back on the job.
Level 4 – Results: the effect the training contributes to in the participant's organisation or wider environment.

Standard workshop evaluation forms and reflection activities, such as the After Action Reviews undertaken by the facilitators, help identify how participants felt about the training (Level 1). Self-assessment scores are useful to indicate changes in knowledge and attitude (Level 2), and some form of practical test can be used to assess the development of skills. The pledges and follow-up interviews used by GDNet establish the extent to which learning can be applied in the researchers' native environment, where factors beyond their control may constrain or support implementation (Levels 3 and 4). In its current design, the policy brief review score falls somewhere between being a practical test of skills (Level 2) and an assessment of the participants' ability to apply their learning in the immediate term (Level 3), but as the discussion above suggests, changes would need to be made in order for it to be an effective measure of either level. Based on the Kirkpatrick framework, to test objectively whether knowledge has increased, one should use before and after questionnaires that test knowledge and are based on the learning objectives. To test skills, participants could be given shorter standardised exercises, e.g. all given the same piece of research from which to draw key messages.
Conclusion

The results for 2013, combined with analysis of the results of the workshops in preceding years, imply that GDNet continues to provide effective training and capacity building activities which demonstrate a significant and immediate transfer of confidence and ability to attendees. The GDNet team has defined an effective approach to training and capacity building and is competent and confident in its delivery. The self-assessment scores reveal interesting differences in increases in confidence and ability following capacity building workshops, depending on the type of participant (male, female, AERC researcher, Awards & Medals finalist, etc.). These merit further investigation, which GDNet is unable to undertake before closure; it is hoped that another organisation might pursue these questions, and others raised in this report.

9 One participant reported that their ability to identify key messages was as high as before (4 out of 5).
Indicator 2 – Researchers' confidence and ability to communicate their research – sustainability of capacity building effort

It has always been the understanding of the GDNet team that increased confidence and ability immediately following a capacity building event is not, in itself, particularly meaningful. Rather, what is more important is a long-term and sustainable increase in confidence and ability, and what this means for how these researchers do their jobs. Output 2 indicator 2 assesses this using the 'pledge'. The long-term sustainability and impact of GDNet's capacity building efforts are assessed 3 months after each workshop through a 'pledge'. Each participant is asked to respond to the following:

Question – What will you do differently as a result of attending this workshop?
Pledge – 'Within the next 3 months I will…'

All 3- and 12-month pledge follow-up data GDNet has received to date are presented in chronological order by workshop/event in an Excel database. A sample of the most informative 'pledge' statements generated in Year 3 (workshops 10 to 13) is presented below.

Workshop 10 – AERC policy brief training

Pledge: Write a policy brief from my research and communicate the results using social media and other media.

3-month follow up: I indeed benefited from my participation in the workshop. I was able to communicate my work on social media platforms, as well as to inform all those who are connected with me on the different platforms about my other work. The work done in Arusha was of a great importance to me and I believe the experience will be repeated again in order to allow to a bigger number of researchers to benefit from your experience. I remember having been asked in the past, in 2010, to write a policy brief out of my research that I had completed, and I did not know how to do it.
Some colleagues I met in Arusha, and who had previously benefited from your workshops, told me that GDNet goes in depth, allowing researchers to understand the expected outcome of such exercise. Once again, thank you and I hope I can benefit from other GDNet workshops.

Pledge: I will reach my policymaker through internet. My policy brief will reach AERC before June 25th (deadline).

3-month follow up: I have not been able to fulfil my promises because, my final report was approved subject to revision based on the comments raised by the resource persons during the biannual. Right now, the revised analysis have been accomplished and am tidying up the final report to send to the A.E.R.C. The policy brief to be drawn from this revised report will be sent to you afterwards. A brief on our study written during the policy brief training workshop could not be sent since it was overtaken by the post biannual review report. Hence the 26th deadline was not met.

Workshop 11 – GDN Awards and Medals Finalists training

Pledge: Be more careful about designing my presentation and making it more suitable to target audience.

3-month follow up: Yes, I think I have successfully realized the goal. After attending the workshop, I have revised the style and contents of the presentation. I worked till 2am the day before I presented my work just to achieve the 2 goals - redesign and readjust. The workshop taught me that what I think is important may not be a vital point in audience's minds, what I think is nice-looking may not be that attractive in audience's eyes. When I redesign my slides, I kept the most important points while deleted somehow less important parts. I also took great care in confining my talk within the time limit. I controlled my speech in a normal speed that audience from different discipline could understand. I also avoided putting down too technical words or econometrics formulae and
tried my best to make the speech easy to follow. All of the above points are core elements that I learnt during the workshop. Therefore, the workshop I attended contributed to my success a lot. Thanks again for the GDN to provide such an excellent workshop.

Pledge: When making PowerPoint presentation, try to be more visual and brief.

3-month follow up: I would like to thank GDNet for arranging such a wonderful training session. Though I have not made any major presentation since the training albeit I would definitely say that workshop/training helped me a big time to understand how to lay down a good power point presentation. Beforehand I would put all the nitty gritties in the presentation but now I know that you provide the skeleton of the concept specially when you have to finish it in a stipulated time. I would say I will be able to use the knowledge that I acquired from that training whenever I make a presentation.

Workshop 12 – ERF workshop "Writing winning research proposals and papers"

No pledge follow-up has been received from Workshop 12 participants, despite the efforts of GDNet to facilitate this.

Workshop 13 – AERC Policy Brief Workshop

Pledge: The way to write a policy brief for policymakers, the way to communicate with other researchers, the way to disseminate the results of the research.

3-month follow up: My participation in the workshop was an amazing and enriching experience at all levels. Techniques and learnings are significantly useful for me as a researcher (Associate Professor at the Félix Houphouët-Boigny University) and an economic policymaker (Director General of African Integration). In my capacity as researcher, the policy brief allowed me to extract useful and relevant information from my research work published or submitted for publication in scientific reviews. I share my policy briefs with target organisms and institutions to value my research and respond to part of my concerns.
Before the Nairobi training, I wasn’t able to write an effective policy brief. As a policymaker, I communicate better with my hierarchy through policy briefs. To give you a concrete example, I am currently in Brussels for the EU-Africa Summit to be held on April 2-3, 2014. Thanks to my policy briefs on economic challenges of the Summit, I was able to work with my Minister and Prime Minister of the Republic of Ivory Coast. Files and discussion points were examined with lots of efficiency and precision. Policy briefs are from now on an indispensable tool for me. I would like to seize this opportunity to express my gratitude to the GDNet team for this initiative “Policy brief training workshop”. It is from my point of view an experience to repeat in order to allow others to benefit from that important capacity building activity. Pledge : Be able to communicate my findings through social media and fellow research groups. I will also make an attempt to get it to policymakers 3-month follow up: I must tell you that the training is yielding a lot of usefulness. One important thing (as we learn from the training) is the inclusion of policy makers right from when the research idea is being built. This I have really made use of. To this end, I wrote a proposal to PEP- Partnership for Economic Policy titled: Self-employment and entrepreneurship of youth in Nigeria: Do remittances have any role to play? I will be in Bolivia between April 30 and May 8 to present the proposal. In the proposal, we included The National Directorate of Employment (NDE),
The Central Bank of Nigeria and Ministry of Finance. These are some of those that will implement the resulting policies from the study. In fact NDE gave one of its research officers to join the our research team. In essence, its a sort of collaboration stuff which I do hope would lead to meaningful policy recommendations that would be implementable.

As well as generating new pledges and follow-up in Year 3 from workshops 10 to 13, GDNet has also engaged past GDNet training and capacity building recipients to provide 12-month follow-up on their pledges. Because of the expected non-response rate at each stage (pledge, 3-month follow-up and 12-month follow-up), the sample after 12 months is much smaller. Nevertheless, a significant number of attendees have responded with 12-month feedback on their pledges. A sample of these from workshops conducted in the Baseline, Year 1 and Year 2 periods is provided below:

Workshop 8 – AERC Policy Brief Training Workshop

Pledge: Try to disseminate my research outputs in line with what I have learned; give my first press-conference.

3-month follow up: I had planned for the country workshop this December, unfortunately, I had a poor response from policy makers because of timing and I am postponing it to February/March next year. I have not been able to give a press conference yet. I have not chickened out. I will do it at the appropriate time. Please, I feel like discussing this idea with you. I need help to be connected to people and organisations that can help me kick start it.

12-month follow up: The workshop took place on 14th of March, 2013 at Sheraton Hotel in Pretoria; and I prepared a workshop report to AERC. Policy-makers attended from various government bodies (National Treasury, National and provincial departments of health, Economic Development Department etc), universities and civil society. In my view, it was encouraging with a level of satisfactory engagement from the audience.
Pledge: Finalise my policy brief on the AERC project; develop two more policy briefs under the future agriculture consortium. I will be glad if GDN can provide me with feedback on the draft policy brief.

3-month follow up: I managed to complete the 3 policy briefs and submitted them for review. The workshop assisted me very much to understand that a brief is nothing without recommendations or what the policy maker can take home. I have tried to build that into the ones i have done.

12-month follow up: I should say i haven't been able to communicate my research recommendations with any of the targeted audiences. However, I am positive, I recently joined the Lilongwe University of Agriculture and Natural Resources as a Lecturer which gives me more leeway in communicating my research work in different forums. My research work is also on the Future Agricultures website and Research gate where it is available for download and public view.

Pledge: In three months, I will identify my target policymakers in order to hold a dissemination workshop of the results of my work on links between economic growth.

3-month follow up: Regarding the dissemination, we had a slight delay on the organization as AERC just sent us the grant agreement that we signed and returned. We have chosen the Ministry of Economy and Planning as the godfather of the ceremony and if AERC sends us money, we will arrange the release in late October or early November.

12-month follow up: The dissemination workshop of the results of the study on the relationship between economic growth and poverty reduction went well on January 22nd, 2013; with forty participants who are decision makers economic,
political, donors, students, researchers, members of the management team of the National Poverty Reduction. The public and private media have also widely distributed; briefings to Chad radio, television and private newspapers were made. We received congratulations from everywhere and even the Minister of Planning and Economy. Another benefit of the workshop which I did not expect is that the Economic Community of Central African States (ECCAS) will organize a workshop on modeling agricultural growth and poverty reduction in the framework of the implementation of CAADP in Libreville from 18-20 February 2013. Thanks to the dissemination workshop reported in the media, they contacted me and asked me to go and represent Chad in this conference. As I could not go because of my commitments with AERC in Arusha, Tanzania, I have appointed another colleague who represented Chad in Libreville. According to the project leader, I might be requested to provide technical support during the entire course of the project. This shows how a dissemination workshop is very beneficial for researchers because it make them known by the public.

Workshop 9 – AERC Policy Brief Training Workshop

Pledge: I will prepare a policy brief from my research project and a new article for consideration by the press/media.

3-month follow up: I am in a position to prepare both the policy brief and media article using the knowledge gained from the workshop. We have not published the media article because my research colleague advised that we could share it with the media after the dissemination workshop of our project to policymakers and academics within Uganda. I intend to distill policy briefs for some of my other research work and if acceptable I could share them with you or Andrew for review.

12-month follow up: I am happy to inform you that the paper was accepted and published in the Journal of African Development (JAD) - Spring 2013, Volume 15 # 1 with the other papers in that project.
I have prepared the policy brief and media article are shared with my co-author for review before publication. However, as indicated in my earlier mail, we will publish the briefs after the dissemination workshop which we have not been in position to organize to-date. I can however, share with you a copy of the policy brief from my other working papers prepared using the knowledge gained from the policy workshop. I have not prepared any media articles for my other working papers. Pledge : I will attempt to make my research more relevant. 3-month follow up: The greatest obstacle is really breaking the barriers to reach out to the intended research consumers, in particular those in policy arena. Nonetheless, efforts in this direction are on-going. 12-month follow up: It is great for me having to hear from you and learning that you're keen following up on the impact of the training we had. As you may recall, one of the key issues in making research relevant was knowing issues that policy makers grapple with, and being able to deliver timely and robust research output on such key issues. Over the past one year (2012), I have explored this, taking advantage of working in a policy environment and undertaken two important research projects addressing pertinent issues in the formulation and implementation of monetary policy at the central Bank of Uganda. - One was an analysis of exchange rate pass through to domestic prices in Uganda, and recently published in the Journal of Statistics and Econometric Methods. The estimate in here is the current evidence based statistical justification the Bank of Uganda is using to dampen exchange rate volatilities.  During the same year, the government of Uganda anticipated funding the national budget deficit using domestic borrowing after major donors suspended budget support to Uganda. Whether this form of deficit financing wouldn't compromise macroeconomic stability was a significant concern. 
This is what I have addressed in the second paper, but is yet to be disseminated to the wider public as it awaits initial publication in the Bank of Uganda Staff Working Paper series. Internally however, results are being
