Measuring Communications Impact at EPRC

Presenter: Elizabeth Birabwa

Podcast: http://bit.ly/1jUBny1

Elizabeth Birabwa, programme manager at the Economic Policy Research Centre (EPRC) in Kampala, Uganda, presents EPRC's tools for measuring communications. Elizabeth has over 14 years of experience in advocacy, communications, media relations and information management. She holds a bachelor's degree in Mass Communication from Makerere University and a master's degree in Library and Inform…

Published in Education
Transcript

  • 2. Outline
      – About EPRC
      – Results Framework at EPRC
      – Why Measure Communication
      – What We Measure
      – How We Measure
      – Challenges of Impact Measurement
      – Lessons Learned
  • 3. About EPRC
      – EPRC is a Uganda-based policy think tank
      – Mission: foster sustainable growth and development of the Ugandan economy by advancing the role of research in policy processes
      – Conducts research and policy analysis, provides policy advice, and engages in policy outreach and engagement
      – Has a four-year strategic plan that guides programmes, and a Policy Engagement & Communication (PEC) Strategy
  • 4. EPRC Results Management Framework
      – To improve performance and implementation of the Strategic Plan, EPRC adopted a results management culture that focuses on outcomes and impact
      – EPRC has developed an in-house M&E data collection tool for tracking and reporting on results
      – The M&E function and tool are administered by the Information and Dissemination Unit
      – Indicators for measuring communication are integrated in the tool
  • 5. Why Measure Communication?
      – To ensure that our information products and services remain of the highest quality and reach our target audiences in the most effective manner
      – Specifically, to:
          – Produce well-crafted information products and services
          – Improve management of product development, production and distribution
          – Ensure we are reaching the intended audiences, in the right way and at the right time
          – Assess the effect or impact of our products and services
          – Increase use of the products and services and, in turn, improve uptake of research in policy processes
  • 6. What We Measure
      – Reach: the extent to which information is distributed, redistributed and referred to, i.e. breadth and saturation
      – Usefulness: the quality of products and services, i.e. whether they are appropriate, applicable and practical
      – Use: what is done with knowledge gained from our information products and/or services
  • 7. How We Measure
      – A standard for monitoring whether products and services meet the requirements that make them effective and useful to policy makers
      – A monitoring tool is used to track outputs and outcomes
      – Ten communication-related indicators are incorporated within the tool
      – We rely on specific tactics for collecting information for each indicator, as discussed in the following slides
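The idea of embedding the ten communication indicators in one monitoring tool can be sketched as a simple record structure. This is a minimal illustration only: the field names, dimensions and values below are assumptions for the sketch, not EPRC's actual tool or schema.

```python
# Sketch of an indicator record for a communications M&E tool.
# All field names and sample figures are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class IndicatorRecord:
    indicator_id: int        # 1-10, matching the ten indicators in the tool
    dimension: str           # "reach", "usefulness" or "use"
    description: str
    observations: list = field(default_factory=list)  # raw data points

    def add(self, value):
        """Record one observation, e.g. a batch of copies distributed."""
        self.observations.append(value)

    def total(self):
        """Aggregate the observations for reporting."""
        return sum(self.observations)

# Example: indicator 1 -- copies of a product sent to the mailing list
copies = IndicatorRecord(1, "reach", "# of copies distributed via mailing list")
copies.add(250)    # hard copies posted (hypothetical figure)
copies.add(1800)   # electronic copies emailed (hypothetical figure)
print(copies.total())  # 2050
```

In practice each indicator would also carry its data source and significance, as the indicator tables on the next slides do.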
  • 8. Indicators: Reach
      Indicator 1: Number of copies of a product distributed to existing mailing lists, in hard copy and/or electronic form
          – Data captured: number of copies sent; number of people on the mailing list
          – Data source: EPRC contacts database
          – Significance: helps us reach many people with our information products
      Indicator 2: Number of copies of a product distributed through additional channels, e.g. trainings, workshops or meetings
          – Data captured: number of copies, venue and date of distribution
          – Data source: administrative record, in the form of a distribution form
          – Significance: helps track distribution that occurs in tandem with related events, which increases the chances of information being understood and applied, and helps build demand for the products
  • 9. Indicators: Reach & Demand
      Indicator 3: Number of products distributed in response to orders or user-initiated requests
          – Data captured: number of phone orders, email requests and interpersonal requests
          – Data source: administrative records/form kept by the Knowledge Management Specialist
          – Significance: helps measure demand, and thus indirectly gauges perceived value, appropriateness and quality
      Indicator 4: Number of file downloads in a time period, i.e. Internet users' transfers of content from the EPRC website to their own storage media
          – Data captured: web server log files
          – Data source: web analysis software (Google Analytics) run by IT specialists
          – Significance: shows which information products and topics are used most on the website, and which countries or regions use the website most
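Indicator 4 can also be computed directly from the web server log files, without an analytics dashboard. The sketch below counts successful PDF downloads from access-log lines in the common Apache style; the log lines and publication file names are made up for illustration and do not come from EPRC's logs.

```python
# Sketch: tally successful PDF downloads (indicator 4) from web server logs.
# Log lines and file paths are hypothetical examples.
import re
from collections import Counter

LOG_LINES = [
    '41.210.1.5 - - [10/May/2014:10:12:01] "GET /pubs/policy_brief_34.pdf HTTP/1.1" 200 51234',
    '41.210.1.5 - - [10/May/2014:10:13:44] "GET /index.html HTTP/1.1" 200 2048',
    '196.0.4.2 - - [11/May/2014:09:01:10] "GET /pubs/policy_brief_34.pdf HTTP/1.1" 200 51234',
    '196.0.4.2 - - [11/May/2014:09:02:30] "GET /pubs/research_series_112.pdf HTTP/1.1" 200 88776',
]

def count_downloads(lines):
    """Count requests for .pdf files that returned HTTP 200 (success)."""
    pattern = re.compile(r'"GET (\S+\.pdf) HTTP/[\d.]+" 200 ')
    counts = Counter()
    for line in lines:
        match = pattern.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

downloads = count_downloads(LOG_LINES)
print(downloads.most_common(1))  # [('/pubs/policy_brief_34.pdf', 2)]
```

A per-product count like this feeds the "which products are used most" question; grouping by the requesting IP's country would answer the "which regions" question.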
  • 10. Indicators: Reach & Demand
      Indicator 5: Media mentions of EPRC events, staff and products
          – Data captured: media outlets; number and description of articles and radio/TV broadcasts; audience reach of the media outlets
          – Data source: administrative records/form kept by the Knowledge Management Specialist
          – Significance: helps measure demand, and thus indirectly gauges perceived value, appropriateness and quality
      Indicator 6: Number of instances that products are selected for inclusion in a library or online resource
          – Data captured: number and type of publications selected; number and type of libraries, information centres or online resources
          – Data source: administrative records/form kept by the Knowledge Management Specialist
          – Significance: captures reach and also serves as a proxy measure of quality, since librarians ask for what they believe is beneficial to their clients/users
  • 11. Indicators: Reach & Quality
      Indicator 7: Number of events held, i.e. policy engagements organized (e.g. workshops, conferences) to share and/or discuss emerging policy issues or to disseminate research findings
          – Data captured: number of participants, by gender and type of sector, e.g. NGO, public, private, media or donors
          – Data source: participants' registration form
          – Significance: captures reach, as well as demand for and feedback on events
  • 12. Indicators: Usefulness
      Indicator 8: Percentage of users who are satisfied with a product or service
          – Data captured: qualitative data covering user likes, dislikes and attitudes, using scales, e.g. how strongly they agree or disagree with statements on frequency, subject categories and technical quality
          – Data source: feedback forms distributed with the product by the Knowledge Management Specialist, who receives feedback via email
          – Note: the ideal would be to use other forms of surveys (online or telephone surveys, or interviews with users), but this requires time to develop tools and analyse data
      Indicator 9: Percentage of users who rate the content of a product or service as useful
          – Data captured: qualitative data recording the relevance and practical applicability of the content, e.g. "Was the topic covered in the product interesting and useful to you?"
          – Data source: mostly feedback forms distributed with the product
          – Note: ideally user surveys would be used, but these are not done for the same reasons as above
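Turning agree/disagree feedback forms into the "percentage of users satisfied" figure (indicator 8) is a simple tally. The sketch below assumes a five-point agreement scale in which "agree" and "strongly agree" count as satisfied; the scale labels and the sample responses are assumptions, not EPRC's actual form wording.

```python
# Sketch: compute "percentage of users satisfied" (indicator 8) from
# feedback-form ratings. Scale labels and sample data are hypothetical.
SATISFIED = {"agree", "strongly agree"}

def percent_satisfied(responses):
    """Share of respondents (as a percentage) who agree or strongly agree."""
    if not responses:
        return 0.0
    hits = sum(1 for r in responses if r.lower() in SATISFIED)
    return 100.0 * hits / len(responses)

# Example: eight feedback forms returned with a product
forms = ["strongly agree", "agree", "neutral", "agree",
         "disagree", "agree", "strongly agree", "neutral"]
print(percent_satisfied(forms))  # 62.5
```

The same tally, applied to a "was the content useful?" question, would yield indicator 9.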
  • 13. Indicators: Use
      Indicator 10: Number of users using an information product or service to inform policy
          – Data captured: number of policy recommendations provided to clients; number of recommendations used by clients; evidence showing how recommendations have been used
          – Data source: mainly informal (unsolicited) feedback on the various policy recommendations, received via email, phone or in person; review of copies of policies, guidelines or protocols that reference or incorporate information from our products or services
          – Challenge: users may not recall which particular information source they used, and information may be used but not referenced
  • 14. Challenges
      – Harmonization: reconciling different data requirements from different projects and data sources
      – Consistency: capturing information systematically, in ways that are straightforward, and on a regular basis
      – Measurement: outcomes from different information products and services vary and lead to a wide range of impact and influence, most of which is intangible and very hard to measure
      – Relationship building: not everyone within the department or organization may be motivated to embed M&E within their work
      – Time: collecting and analysing data may need to span a long period before it reflects the impact of communication products and services
      – Change: tools and the data collected keep changing, e.g. the advent of social media and web-based resources has brought new ways of sharing information, and thus of monitoring it
  • 15. Lessons Learned
      – No one tool suits all communication M&E requirements; various tools are needed to capture specific communication products and services
      – Tool design requires the full participation of all parties that will be involved in using it
      – Continuous capacity building in tool use is needed to facilitate regular capturing of data