Key Points: Partnered with the SFA to contact all known Science Festivals at that time. 19 Festivals responded in 2011; 21 responded in 2012, or 78% of known Festivals.
Key Points: For 2013, half of the Festivals spend less than $5K on their evaluations; the other half range from $5K to $25K. Most Festivals spend less than 5% of their overall budget on evaluation. Most Festival evaluations include three evaluation activities; those with larger budgets include a higher number of activities.
Key Points: This was the method planned for use by the largest number of Festivals in 2012 (80%). Interest in the SFA survey format and in qualitative means for collecting these data was also high (76% interested in each). SFA has a Host Survey that it administers every year. NC used a survey for Event Hosts and an interview for Sponsors.
Key Points: Surveys are the method most commonly used by Festivals to collect data from attendees, though only about half of the Festivals collect these data. Surveys are usually done as intercept surveys so that the data are collected at the event itself. Email surveys are sometimes used if event registration captured attendees' email addresses; AZ sent follow-up surveys one month after the Festival. Since 2011, SFA has used a new and returning attendee survey; its primary benefit relates to the behavior change questions. Length has been a challenge. SFA surveys have ranged from 1 to 2 pages over time; the most recent version is one page. The NCSF survey was 1.5 pages last year, which was too long. Seattle and Arizona each used half-page surveys at some events and were quite happy with the results. NCSF tested the idea of using two forms this year to gather as much data as possible; questions core to the mission were included on both forms, and additional questions of interest were spread across the two.
Key Points: Attendee surveys require a team of people to collect the data at events. SFA hires a team for each site (e.g., 15 for San Diego). NC used a few strategies: we hired teams in two areas of the state and partnered with a college professor who integrated the data collection into her students' practicum experience. AZ also worked with students: 2 graduate students led the team, and 8 undergrads were paid through the University's apprenticeship program. Seattle used volunteers who self-identified as interested in helping with the evaluation. We tried using volunteers in San Diego in 2009, and it did not work well (but they did not self-identify either); an interest in helping with the evaluation may be key. All agree that it is important to have evaluators whose sole role is data collection. Richard does not believe that any of these models is sustainable; he is interested in working with host sites to identify staff who can play this role at the event.
Key Points: Most Festivals are still using pencil-and-paper methods to gather data from attendees. In the end, at least four sites tried using iPads in 2012. Two of the three sites who tried iPads prefer pencil and paper; one would choose to use iPads for a longer survey. Lessons learned: no time is saved in collecting the data, but you do save time on data entry and you get to see results right away; logistics to plan for include rentals, getting service, and page setup.
Key Points: On the needs assessment, K-12 evaluation was a key interest. We did not ask about this topic in particular, but we did ask for additional suggestions; this was the top category mentioned, often in conjunction with the need for longitudinal evaluation to measure longer-term impacts. NC has teacher, parent, and student surveys for Festival nights held at elementary schools. NC and San Diego each evaluate high school-level programs that match a scientist with a high school classroom to present on a topic of interest; evaluation of each has gathered feedback from teachers, students, and scientist presenters. As far as I am aware, we do not have any models yet for school-level follow-up.
Key Points: At IPSEC in 2011, we talked about the need to capture the scientist-public interactions that occur at the Festival. NC pilot-tested a Secret Shopper protocol this year by using it at Expo booths. The protocol documented key features of each booth, including several best practices recommended in the How-To video that Philly created. We found that the data collected by our field researchers did differ from those collected from attendees at the Expos, so the protocol provides a unique data point. Even so, there are trade-offs, and we may not collect these data again next year.
Key Points: This was the top-rated item on the needs assessment (90% interest). AZ used a polling service to ask questions related to awareness of the Festival, administered pre-post (before and after the Festival). In future years, they hope to administer it post-only. Results showed a 10% increase in awareness after the Festival, which equated to approximately 600K people being aware of the event. The questions were added to an omnibus survey; the cost is between $800 and $1,000 per question.
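As a back-of-envelope check on that figure (not stated in the source; this is just the arithmetic the two numbers imply): a 10 percentage point gain equating to roughly 600K people implies a polled population base of about 6 million.

$$\frac{600{,}000 \text{ people}}{0.10} = 6{,}000{,}000 \text{ people (implied population base)}$$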
What We Know about Science Festival Evaluation. Karen Peterman, Ph.D., Spring 2013.