2. STUDENT INITIATED PROJECT
In January 2016, I began developing two evaluations for the Harn Museum.
With generous help from the Education Department, particularly my supervisor Eric
Segal, I built a program evaluation from the ground up.
3. WHY EVALUATION?
To prepare for my thesis, which will be presented in Fall 2016, I wanted to
experience the full program evaluation process myself.
I had a few questions:
How does an evaluator begin to understand what program developers
need to know?
How intensive is working with an IRB, and what are the best practices
for evaluations?
What challenges might someone face during an internal program
evaluation?
4. WHY DO PROGRAMS NEED TO BE EVALUATED?
The Harn’s Mission
“The University of Florida’s Samuel P. Harn Museum of Art collaborates with university and
community partners to inspire, educate and enrich people’s lives through art. The museum brings the joy
of experiencing great works of art to diverse university, community, national and global audiences through
relevant and enlightening art collections, exhibitions and learning opportunities.”
Education Action Plan
• Our goal is to stimulate the understanding of art and to promote rich experiences that spur
creativity, pleasure, intellectual engagement and fun.
• Actively involve community and campus representatives in educational planning through
consultations, surveys, advisory committees and other modes of engagement.
5. NUMBERS
The following statistics were collected by Amy Gordan, PhD, and refer specifically to AAM-accredited art museums in Florida.
36.7% of the survey respondents use staff as internal investigators
12% used external investigators but only for purposes of applying for a grant
38.9% of museums that offered gallery activities, learning resources and libraries utilized visitor
studies or evaluations
100% agreed that evaluation was or would be a helpful tool for museums
Then why don’t they use it?
6. DEVELOPING A TOT TIME EVALUATION
• Planning with Brandi Breslin and Eric Segal
• What is Tot Time?
• What are we hoping to learn?
• Development and finalization of survey
• How did I keep my observations consistent?
• What does “take the lead” mean?
• What does “act independently” mean?
• What does “engage with others” mean?
• How did I define parent instruction?
• Institutional Review Board approval
• Ongoing surveys
• Subject selection
• Docent details
• Data evaluation
7. WHAT DID WE LEARN?
• 50% of parents attend Tot Time so their children can have fun
• 75% of parents hoped their children would learn something about art or museums
• 62% of parents believe that Tot Time will influence their child
• 25% left this question blank
• 63% of families that attend Tot Time also attend other voluntary educational programs elsewhere
[Chart: Frequency of Visits, from 1st Time through 10th Time]
8. WHAT DID WE LEARN?
• Children who acted independently frequently (more than 6 times) were 39% more likely to use a
new skill or new vocabulary
• The less instruction the parent provides, the more likely the child is to act independently and
engage with others
• Children of parents who intervened more than 4 times used a new skill or vocabulary an average of 1.6
times and took the lead an average of 2.8 times
• Children with parents who intervened less than 4 times used a new skill or vocabulary an average of 5.7
times and took the lead an average of 7.5 times
• 100% of the children worked on their art projects alone the majority of the time
• 82% of children engaged with attendees who were not their guardians
• Overall, every child/guardian group engaged in thoughtful conversation or laughed together at
least once during the Tot Time experience!
9. WHAT DID I LEARN?
• Importance of a pilot survey
• Evaluators must be comfortable accepting “no”
• Multiple pens are a must
• Follow-up surveys are a challenge
• Developing a survey takes practice and time: each question you ask should
relate back to the goal of the evaluation
• Fewer questions yield better answers