SCC2013 - The challenges of measuring informal science learning - Jenny Mollica

Slides by Jenny Mollica from the session "The challenges of measuring informal science learning" at the 2013 Science Communication Conference, organised by the British Science Association.


  1. Informal Science Learning – Jenny Mollica
  2. Creative Learning
     • Formed in 2008
     • An arts centre and an academic organisation
     • Experimenting with and developing new models of access and learning in the Arts
  3. Core Strands
     • Education
     • Community
     • Young people
     • Public Programme
     • Laboratory
  4. Research & evaluation
     • Own research & evaluation team
     • Two PhD researchers
     • Cultural value
     • Artists working in participatory settings
  5. Research project: Cultural Mobility through Social Intelligence
     • Funded by Creativeworks London
     • Pilot year – Barbican in partnership with QMUL and Chatterbox
     • Measuring emotional and aesthetic responses to our work
  6. Research project: ArtWorks London
     • Three-year research project
     • Funded by the Paul Hamlyn Foundation, exploring the role of the artist in participatory settings
     • Accreditation for artists who deliver informal learning experiences
  7. Why we evaluate
     • Quality
     • Research
     • Monitoring & reporting
  8. Who are we evaluating for?
     • Mixed funding models, whereby a single project may have multiple funding streams
     • Examples of some of our regular funders and their reporting focus:
        • Arts Council England – artistic processes, plus increasingly economic impact
        • City Bridge Trust – social impact
        • Corporate – transferable skills for later life
     • This can make it challenging for us to work towards a common set of metrics
  9. How do we evaluate?
     • Evaluation frameworks
     • Template evaluation toolkits
     • Mainly proximal – limited resources for developing distal indicators of impact
  10. The Rep's Children
     • Birmingham Rep's Learning department in partnership with Warwick University
     • All babies born at two Birmingham hospitals during two weeks in February 2013
     • Offered free cultural experiences at the Birmingham Rep for the first 10 years of their lives
     • PhD researcher
     • How do we measure the impact of people's aesthetic experiences on the footprints of their lives?
  11. The Cultural Sector
     • No over-arching body to pull together a common set of indicators
     • The need for unilateral buy-in from a whole host of funders on an agreed set of indicators
     • At present, the Arts are trying to fulfil a myriad of functions
  12. Policy Making
     • 2008 – Brian McMaster's review, Supporting Excellence in the Arts: from measurement to judgement
     • 2013 – Maria Miller: the Arts must make the economic case
     • The cultural sector is increasingly looking to scientific methodologies to measure impact and learning
     • Arts subjects given reduced priority within the curriculum
  13. Questions & Conclusions
     • The Arts currently have to prove their worth and value in society
     • The need to continuously re-make the case for the arts – data as a tool for advocacy
     • Acquisition of data vs analysis and use of data in a meaningful way
     • Sharing and dissemination of research
     • Are scientific methodologies the way forward for survival in the Arts?