At the UKES 2016 Conference James Ronicle presented on Ecorys's approach to communicating evaluation findings, including the 4 steps to communicating findings effectively.
Talk, Chalk and Tweet: UKES 2016 Conference
1. James Ronicle, Senior Research Manager
2016 UKES Conference, 27 – 28 April
Talk, Chalk and Tweet
Communicating evaluation findings to a wide range of audiences
2. My opening gambit
This report is intended for:
- Researchers
- Policymakers
- Funding bodies
- Practitioners
- Service users
There is no one-size-fits-all approach to communicating evaluation findings
5. Commissioning Better Outcomes (CBO) Fund evaluation
• Aims: support the development of more social impact bonds (SIBs) in England
• Activities: £35m to part-fund up to 35 SIBs
• Evaluation:
• Assessing the added value of commissioning through a SIB
• Process evaluation, focusing on in-depth reviews of 10 SIBs
• Analysis of outcomes data collected by SIBs
• Strong learning and communication strand
7. The four steps: Step 1: Audience analysis → Step 2: Communications strategy → Step 3: Implementation → Step 4: Evaluation
Step 1: Audience analysis
1. Who are the audience members?
2. What elements of the findings will they be interested in?
3. What is their current level of understanding of the topic?
4. How much detail are they interested in?
5. What type of communication method will they most engage with?
6. How motivated are they to engage with the research?
Audience members: client; commissioners; service providers; investors; organisations wishing to grow the SIB market; advisors / intermediaries
Levels of interest:
• Broad interest in social investment/PbR, but almost no awareness/understanding of SIBs
• General interest in SIBs and limited prior knowledge
• Strong interest in SIBs and good prior knowledge
• Very strong interest, involved in delivery
8. Step 2: Communications strategy
Chalk: full report; summary report; case study; infographic
Talk: conference; learning seminar; webinar; policy roundtable
Tweet: social media; blog
9. Step 2: Communications strategy
Audiences and outputs:
Tier 1: Those with a broad interest in social investment/social enterprise/Payment by Results (PbR) contracts, but almost no awareness or understanding of SIBs
• Blogs and social media from more general accounts (e.g. BIG)
• Press releases for general publications (e.g. the Guardian)
• Conferences
Tier 2: Those with a general interest in SIBs and limited prior knowledge
• Specialist social media
• Case studies
• Executive summaries of reports
• Webinars
• Press releases for specialist publications (e.g. Pioneers Post)
Tier 3: Those with a strong interest in SIBs and good prior knowledge
• Full end-of-year report
Tier 4: Those with a very strong interest in SIBs and expert prior knowledge
• In-depth reviews
• Targeted audience reports
• Learning seminars / peer learning events
• Investor breakfast meetings
10. Step 3: Implementation
Tier 1: Those with a broad interest in social investment/PbR contracts, but almost no awareness or understanding of SIBs
Outputs: general blogs & social media; conferences
11. Step 3: Implementation
Tier 2: Those with a general interest in SIBs and limited prior knowledge
Outputs: specialist blogs & social media; infographics; summary reports; press releases for the specialist press; webinars
12. Step 3: Implementation
Tier 3: Those with a strong interest in SIBs and good prior knowledge
Outputs: full report
13. Step 3: Implementation
Tier 4: Those with a very strong interest in SIBs and expert prior knowledge
Outputs: in-depth reviews; targeted audience reports; roundtables / seminars
14. Pulling it all together
CBO website
15. Step 4: Evaluation
• Reports: over 1,100 and around 600 people actively engaged with the last two reports
• Mailing list: 792 members
• Events: 100% of delegates rated them good or excellent; feedback is improving across events
• Press: articles in Civil Society News, Nonprofit Quarterly and Research Matters
• Social media: 298 followers, 137 retweets, 110 favourites, 62 mentions
• Blog: 446 views, 238 unique visitors
17. Lessons learnt
• There is no one-size-fits-all approach to communicating findings
• It's not about reducing everything to the lowest common denominator
• It requires a wider range of expertise than we're used to in evaluation:
• Graphic designers
• A learning manager – not necessarily a researcher, but someone with communications skills
• Who communicates the findings is important; people respond better to the 'user voice'
• Nothing beats face-to-face interaction
• Outputs must all be linked in 'tiers' – start simple, then link to outputs with more information for those interested
Editor's Notes
Hello my name is James Ronicle, I am a Senior Research Manager at Ecorys UK, and today I am going to be talking about our experiences of communicating evaluation findings to a wide range of audiences.
I want to start my presentation with my opening gambit.
There is no one-size-fits-all approach to communicating evaluation findings.
How many times do we see, or indeed put, this at the front of our reports – the intended audience, which is often a list of very different audiences?
The reality is each audience member is different. They:
- Are interested in different aspects of the findings
- Have different levels of understanding of the topic
- Want different levels of detail
- Have different levels of motivation to actually read or listen to your findings.
And so I think that, if we are to maximise the reach and impact of our research (which I am assuming we want to do, though we don’t always), then we need to tailor our outputs to our different audiences. And that is what I am going to be talking about today.
So, through an example project – the Commissioning Better Outcomes Fund, I am going to discuss our approach to communicating findings to a wide range of audiences.
I will first describe a bit about the evaluation, before talking through the four steps of our communications plan, which are:
Step 1: Audience analysis
Step 2: Communications strategy
Step 3: Implementation
Step 4: Evaluation
Finally, I’m going to summarise with some of the main lessons we have learnt in communicating findings, both from this experience and others.
So firstly to provide a bit of context about the example evaluation I am going to be talking about.
The CBO fund is funded by the Big Lottery Fund and aims to support the development of more SIBs in England. The focus of the evaluation is assessing the added value of commissioning services through SIBs. It is a nine-year evaluation comprising two main elements:
- Process assessment: focusing on 10 SIBs
- Impact assessment: Reviewing outcomes gathered by the SIBs
I chose this evaluation as a case study for 3 reasons:
SIBs are relatively new and relatively complex, so communicating about them is not an easy task.
There is a broad range of stakeholders involved in SIBs, perhaps broader than in your average evaluation, so we had to think carefully about how to communicate to a broad range of audiences.
The evaluation had a strong communication and learning component, whereby we were obliged to disseminate the lessons learnt widely, so we had a good opportunity to adopt a range of different dissemination approaches.
And so I am now going to talk through the 4 steps we have adopted for communicating our findings.
Some caveats:
1. The example I am going to give is an evaluation where the learning/communications budget was of equal proportion to the evaluation budget, which is rare. I recognise we wouldn't do this for every evaluation, but it gives a flavour.
2. There was an explicit focus from BIG on sharing learning, and relatively free rein over the content of some communications, e.g. tweets. I recognise this isn't always the case.
The first step in planning any communications is working out who exactly the audience is for your outputs.
And not just who they are, but thinking about what they want out of your evaluation output. You need to answer a set of questions, as these will determine what communication method you will use.
What elements of the findings will they be interested in? Not everyone is going to be interested in every aspect of the findings, so you need to think about how you're going to tailor it for different audiences.
Current level of understanding: this determines how much detail you go into, how much of an introduction you provide, and the type of language you use.
Detail
Communication method
Motivation – some people are very motivated and would read the report in whatever guise it comes; others have very little motivation, so you need to think more about how to make it engaging.
For CBO, we identified two main differentiating factors within our audience that we thought were the most important to consider when designing our communication approaches.
The first is audience type – 6 main audiences. These will be interested in different aspects of the findings.
The second was level of interest/understanding. Because SIBs are new and complex, the audience would have very different levels; we identified four main types.
Once you have thought about your audience, and the main differentiating aspects within the audience, you can design your communications strategy.
Essentially this is matching up the different communication approaches with the different audience members.
In a very crude way there are 3 types of communication approaches available to you, although obviously there are various approaches you can take within this. They are:
- Chalk: a written output
- Talk: a verbal output, such as a conference, seminar or webinar
- Tweet: a social media output, such as Twitter or a blog
And so what we finish with is a set of audiences and different outputs tailored to those audiences. We essentially created four 'tiers', producing a different set of outputs for each tier.
I'm now going to talk through some of what we did as part of Step 3: Implementation, taking each tier in turn.
The first audience is a more generalist one, involved in wider social investment/PbR but with almost no awareness or understanding of SIBs.
The messages in this strand focus on findings within SIBs that are of interest to other areas. The style is non-technical. The audience are likely to only have minimal engagement, and so the focus is on punchy, easy-to-digest messages aimed at places where more ‘generalist’ audiences are going to be.
Hence we aren't targeting written outputs at this audience, as it's unlikely they'll find, or read, them. We have instead focused on social media, using 'generalist' accounts that a lot of people access. So:
- Twitter: here we tweet about the main findings through 'generalist' accounts, like the Big Lottery Fund, which has 47k followers
- Blogs: short blogs summarising the main points. We focused on other people's blogs, which are more generalist, like Big Society Capital. We also focused on guest blogging, as we find people respond very well to hearing findings from people directly involved, not the evaluator. In this example, the commissioner involved wrote the blog.
And plan going forward is to hold conferences with wider audiences.
Our 2nd audience tier is those with general interest in SIBs, but with limited prior knowledge. They are reading outputs to find out what a SIB is, how it works and the strengths and weaknesses of the approach.
And so our priority is outputs that are accessible to a broad audience, using a simple layout, simple language and graphics.
So for this audience our outputs include:
- Specialist blogs & social media: we are still focusing on social media and blogs, due to their easy-to-read tone, but on more specialist ones for those who have a specific interest in SIBs. So we have our own blog and Twitter feed, and we post summaries of our findings on them.
- Infographics: these are an excellent way of communicating complex messages in an easy-to-digest manner.
- Summary reports: essentially stand-alone executive summaries, but written in a more accessible format, with more text boxes, diagrams etc.
- Webinars: these allow you to disseminate the findings verbally and to answer questions. We generally use them to feed back the findings to the projects involved.
- Press releases for the specialist press: we identify specialist publications, like Nonprofit Quarterly, Pioneers Post and Civil Society News, and send them press releases. This headline is from Civil Society News, who wrote an article about our report.
The third tier is those with a strong interest in SIBs and good prior knowledge.
And this tier is easy – it is the standard full report that we are all used to. Typical language, typical length, etc.
Believe it or not, we actually found a fourth tier of detail below the standard 'full report'. We have an expert pool, made up of stakeholders engaged in SIBs, and we ran our reports past them. What we found was demand for more detail – essentially this tier is those involved in, or wanting to set up, a SIB who want to understand exactly how others have done it.
Our principle for these outputs is that someone could pick one up and develop a SIB from it. So the outputs for this tier are very technical and very detailed.
Outputs for this tier include:
- In-depth reviews: Very detailed reports on each SIB – their exact structure, how they were developed, advice for others etc.
- Targeted audience reports: taking our end-of-year reports and tailoring them for our three key audience groups, so there is a targeted audience report for commissioners, providers and investors, each including detailed information only other members of that group would be interested in.
- Roundtables / seminars: smaller learning groups specifically tailored to different audiences. This one was for the Ways to Wellness SIB, a health SIB, and included presentations from the service provider, commissioner and investor to a group of organisations developing health SIBs. We have also run a policy roundtable for government departments and intend to run 'investor breakfast meetings' for investors.
And the important thing is that all these outputs tie together, so that people can work their way down from the key headlines to more and more detail if they're interested.
So the tweets have a link to our blog, which has a link to our summary report. Most of the other 'top tier' outputs, like our infographic and news articles, also take you to our summary report. This then links to the full report, which links to the in-depth reviews and targeted audience reports.
And we have a website in the middle that everything is linked to – it acts as a repository holding everything: all our outputs, plus the latest blog posts, tweets, etc.
We undertake evaluations, but I find we rarely evaluate the impact of our own work.
We monitor engagement with each of the different channels, and how many people ‘engage’ with reports (e.g. download report, attend conference etc.).
For our end-of-year-two report we are also going to send out a survey to measure not just the reach, but the impact too.
User voice – so we have people write guest blogs, speak at learning events, etc.