Presentation to NIDA Masters of Fine Arts in Cultural Leadership, covering the role of evaluation in the arts, how to plan an effective evaluation and tips and traps for evaluating in practice.
The art and science of impact evaluation in the cultural sector
1. The art and science of impact evaluation in the culture sector
Tandi Palmer Williams
August 2020
2. Name, role
How you feel about ‘evaluation’
Any specific interests for this session
3. What we’ll cover
1. Role of evaluation in the arts and culture sector
2. An example: what works well, what can be challenging
3. Some resources that I’d recommend
4. Stepping through your assignment
5. Tips and traps
6. Questions
My goal is to leave you with insight into the reality of culture evaluation… and ultimately to help our sector build a strong, ethical, creative data culture.
5. HOW WE THINK
66% of Australians agree the arts should be publicly funded, down from 85% in 2009
Chart: proportion who agree with the statement ‘the arts should receive public funding’ – 85% (2009), 79% (2013), 66% (2016)
Source: Australia Council, National Arts Participation Survey 2016
6. Arts and culture is among the last to ‘catch the data train’
Results from ICDA’s national not-for-profit governance survey:
‣ 38% of NFPs collect outcomes data
‣ 13% of NFPs don’t measure success in any way
8. Organisations that are confident collecting, analysing and using data…
‣ Are in tune with their audiences and communities
‣ Aren’t afraid to ask questions… and hear answers they weren’t expecting
‣ See feedback as fuel for improving
‣ Genuinely want to know if they are making a difference… and how they can do better
‣ Grow more and more confident in their role and value
‣ Are recognised for their achievements
‣ Attract in-bound opportunities for their community, their organisation and their staff members
‣ Build morale and staff retention
‣ Have good relationships with funders
‣ Find reporting easy and useful
‣ Find it easier to develop great funding applications
‣ Achieve their goals, in less time
‣ Create more and more good in the world.
9. How data powers cultural organisations to do even more good in the world
The Five I’s of Data-Powered Organisations:
‣ Insight: Gathering data to deeply understand their community helps them target their work and spot opportunities.
‣ Innovation: Prioritising, testing and refining experiences based on robust feedback means they are confident in taking risks.
‣ Impact: Evaluating the impact of their programs means they are continuously improving and demonstrating value.
‣ Influence: Sharing data and thought leadership means they are influential in conversations and recognised as leaders.
‣ Investment: By harnessing evidence they attract inbound opportunity and build powerful cases for investment.
10. So, should we all become analysts and evaluators?
It’s less about… | …and more about
Collecting more data… | …using the data we have in our decision-making
Sending more surveys… | …making sure our surveying is worthwhile
Learning advanced Excel… | …knowing what kind of analysis is needed
Hiring more consultants… | …building our internal capacity
Doing more evaluations… | …being more strategic in using evaluation
Having all the answers… | …asking better questions
Reacting to requests for information… | …being proactive in offering evidence
Demonstrating our value… | …finding ways to deliver even more value
11. Collecting data is just a small step in an effective evaluation practice
Process diagram, Data to Impact: Framework design → Data collection → Analysis and interpretation → Action planning → Communication and implementation → Outcome
12. When to think about conducting evaluation… and when to give it a miss
Good times:
‣ You’re aiming for a step-change in your reach or impact
‣ Funding for your work is lapsing in the next 12 months – or you’re looking for a new partner
‣ You’re trying something new (e.g. a pilot) and want to determine whether to continue
‣ Engagement levels are falling or have not met expectations
‣ You’re hearing whispers of discontent or people having negative (or mixed) experiences
‣ It’s been three or more years since you’ve done any formal evaluation*
Less good times:
‣ You’re totally occupied with delivering – e.g. in festival mode
‣ You don’t have any big questions right now
‣ You already have a lot of data / you haven’t yet acted on previous evaluation
21. Audience Research Toolkit: Project Planning Template
1. Title/program/organisation
Give this project a working name.
2. The opportunity
Why is this research needed? What is the opportunity it addresses?
Y/N Detail
Understand attendees
Increase attendance
Build a new audience
Other
3. Audience for the findings
Who is the research for? How will the results be used? When will the
results be needed? In what format?
Audience When Format
4. Target population
Who is the subject of this research? What group do you want to find out
about? Can you estimate the size of this group?
5. Areas of enquiry
What kinds of questions are you interested in? What topics do you want
to explore?
6. Methods
What methods could be right for this project?
7. Tools, templates and guidance
Using the Audience Research Toolkit, filter the resources to identify
resources that could help you, and list them below.
Tools Templates Guidance
22. 8. Roles
What are the key roles on the project – and who might fill these
positions?
Project lead
Project manager
Contractors,
consultants, partners
or volunteers
Other stakeholders
9. Resources
How much time & money is reasonable to spend on this project? How
much is available?
Low range / High range
Budget: $ / $
Staff time:
10. Process
What are the key steps and stages to completing this project? You
might like to think about planning, fieldwork, analysis, interpretation
and reporting.
1.
2.
3.
4.
5.
6.
11. Timeline
What is the ideal timing for this project? What key dates should the
project work towards? E.g. sector forums, board meetings, holidays.
Start
Finish
Other relevant dates
12. Success factors
What does success look like? What will help ensure the research delivers
the desired insights & outcomes?
13. Risks and challenges
Thinking through the steps, stakeholders involved and resources. What
are the potential stumbling blocks? What risks need to be managed?
14. Questions
What advice do you need? Note your questions down here.
1.
2.
3.
24. EVALUATION PROGRESS: Your assignment
1. Select a program delivered by a subsidised arts
or cultural organisation. Refer to examples
provided to ensure you choose a suitable
program.
2. Develop a one-page ‘program logic’ using the
template provided. This should identify:
1. Social issues that the program addresses
2. Activities and outputs of the program
3. Short, medium and long-term outcomes.
3. Summarise the extent and prevalence of the
relevant social issue(s), drawing on the literature
(including grey literature where relevant).
4. Identify and describe the size of the target
population for the program.
5. Articulate the value proposition of that organisation using a formula such as ‘[Program A] delivers [activity B] to [population C] to achieve [short term outcome D] and [long-term outcome E]’.
6. Design a concise evaluation framework, which identifies:
• An overall evaluation question
• A set of (quantitative) key performance indicators for the program – covering activities, outputs and outcomes
• Areas of enquiry for qualitative research.
7. Discuss the potential research methods that
could be adopted to answer the overall
evaluation question.
25. The goal of the assignment is to plan and design an evaluation
1. Planning: Identifying an interesting case study and planning your evaluation process
2. Design: Developing a framework, key question and defining your terms
3. Fieldwork: Collecting data via desktop review of available documents, interviews and surveys etc.
4. Analysis & reporting: Interpreting the data to identify key findings and draw a conclusion
28. An example of how this works…
‣ Problem: What is the key social issue that the program addresses? E.g. social isolation of elderly people in regional Australia
‣ Target population: Who is the program serving? What is the primary group of people it targets? E.g. residents of regional areas aged 75+
‣ Program activities: What does the program offer? E.g. music performances + morning tea + transport service
‣ Outcomes: Music brings people together; elderly people increase their social interaction, and feel less isolated; elderly people live longer, happier lives
‣ Assumptions: Elderly people want to use this service, and are able to; attending arts events makes people feel less isolated; music is the key ingredient
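If it helps to keep program logics consistent across programs, the example above can be sketched as a simple data structure. This is an illustrative Python sketch only – the field names (problem, target_population, activities, outcomes, assumptions) mirror the columns of the example and are not a standard schema:

```python
# A minimal sketch of the example program logic as a Python data structure.
# Field names are illustrative, chosen to mirror the slide's columns.
program_logic = {
    "problem": "Social isolation of elderly people in regional Australia",
    "target_population": "Residents of regional areas aged 75+",
    "activities": ["Music performances", "Morning tea", "Transport service"],
    "outcomes": {
        "short_term": "Music brings people together",
        "medium_term": "Elderly people increase their social interaction "
                       "and feel less isolated",
        "long_term": "Elderly people live longer, happier lives",
    },
    "assumptions": [
        "Elderly people want to use this service, and are able to",
        "Attending arts events makes people feel less isolated",
        "Music is the key ingredient",
    ],
}

def check_logic(logic: dict) -> list[str]:
    """Return a list of missing sections, so gaps are visible at a glance."""
    required = ["problem", "target_population", "activities",
                "outcomes", "assumptions"]
    return [key for key in required if not logic.get(key)]

missing = check_logic(program_logic)  # an empty list means no gaps
```

Writing the logic down in one structured place, however informally, makes it easy to spot a program logic with a missing section (for example, no stated assumptions) before fieldwork begins.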
29. Good evaluations have well-defined objectives and a clear evaluation question
Project objectives
‣ What is the reason for doing this evaluation? Who is it for? Why is it important? E.g.
‣ After a one-year pilot, evaluation is needed to determine if the program is achieving its objectives, and assess the case for full-scale rollout.
Evaluation question
‣ What is the overall question you are trying to answer? Can it be summarised in a short sentence? E.g.
‣ How does Program X address social isolation among older residents?
‣ Did the program shift perceptions about people with a disability?
Areas of enquiry
‣ Reach: How many people used the service? What is the customer profile and how does this compare to the profile of all elderly residents in the area?
‣ Experience: How did the users find the experience of using the service? How satisfied were they? Did usage grow during the pilot period?
‣ Impacts: What impacts did users report? Did the program impact users’ subjective wellbeing?
‣ Learnings: What were the successes and challenges of program delivery? What learnings can be applied during full-scale rollout?
30. Consider what data you need, from how many people, and what is achievable for you to collect
Qualitative:
‣ Literature review
‣ Conduct interviews*
‣ Run a focus group*
‣ Observe attendees in a cultural space*
‣ Develop case studies
Quantitative:
‣ Count attendees*
‣ Analyse attendance trends*
‣ Build a wider community profile*
‣ Understand your digital data (analytics)*
‣ Send a short survey*
‣ Send a strategic survey*
* Visit the Audience Research Toolkit for further guidance
32. Things to watch out for when conducting research and evaluation:
‣ The vanity metric
‣ The data kitchen
‣ The echo chamber
‣ The too-hard basket
‣ The dusty shelf
‣ The HiPPO (highest paid person’s opinion)
Source: Patternmakers – Data After Dark
33. When surveying, remember your METRIQS
‣ M: Motivation: Use the guide to incentives to design your offer
‣ E: Execution: Think carefully about the right format, and how it can be accessible
‣ T: Timing: Carefully pick your timing to deliver results when you need them
‣ R: Reliability: Identify your population and calculate your sample size
‣ I: Invitation: Craft a compelling email & subject line
‣ Q: Question design: Get a second opinion to ensure your questions aren’t leading
‣ S: Stop: Pause, postpone and/or reduce your surveying if you’re not getting value from it.
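The Reliability step above (identify your population and calculate your sample size) can be sketched in a few lines. This is an illustrative Python sketch using Cochran’s formula with a finite population correction – a standard approach, though not one the deck prescribes – and the defaults (95% confidence, ±5% margin of error, 50% proportion) are common conventions, not recommendations:

```python
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                z_score: float = 1.96, proportion: float = 0.5) -> int:
    """Minimum survey sample size via Cochran's formula with a
    finite population correction.

    A z_score of 1.96 corresponds to 95% confidence; a proportion of
    0.5 is the most conservative assumption (largest sample).
    """
    # Sample size for an effectively infinite population
    n0 = (z_score ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    # Correct for a finite population (e.g. a mailing list or member base)
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# e.g. a mailing list of 500 people, 95% confidence, ±5% margin of error
needed = sample_size(500)  # → 218 completed responses
```

Note how little the required sample shrinks as the population shrinks: even a list of 500 still needs over 200 responses for ±5% precision, which is why response-rate tactics (Motivation, Invitation, Timing) matter so much.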
34. The art of doing research and evaluation in creative contexts
‣ Asking audiences what they want… but giving them something they could never dream of
‣ Challenging conventional thinking… but respecting artistic decision-making
‣ Being analytical… while thinking creatively
‣ Speaking the language of funders… while being true to the art
‣ Celebrating positive results… and embracing areas of learning
‣ Demonstrating your value… while finding ways to deliver even more value
‣ Meeting increased expectations for evaluation… in the context of stretched resources