Evaluating (Informal Science Ed./Outreach)

A look at evaluation methodologies, from logic models to creative approaches

Published in: Education, Technology
Notes for slides:
  • Not the most organic grouping, but the logic model has gained importance and popularity, and it is useful for framing evaluations, particularly when writing grant proposals. Logic models are a holistic approach: they make assumptions explicit and consider the big picture. A logic model is more of an integrative framework for analysis and can incorporate any of the specific evaluation methodologies, but I've included it here because the approach is institution- or program-specific.

    Interviews, questionnaires, and observations are the bread and butter of affordable evaluation plans. They give you a baseline of both qualitative and quantitative data to work from. Any one of these on its own won't produce a strong dataset to generalize from, but in combination with the other types they can give you a very good picture. Again, this is directed by the organization or project specifically.

    Focus groups and workshops are probably my favorite. With a small group of 6 or 8 volunteers, you can sit down and have a conversation about the hows and whys. Focus groups are useful for obtaining several perspectives on a project, activity, or topic. Unlike one-to-one interviews, people can build on one another's responses and come up with ideas they may not have thought of on their own. They are particularly beneficial for bringing together a range of people or stakeholders. I've gotten more interesting, usable, applicable information from one hour-long focus group than from studies with 10,000 responses.

    Creative methods, incorporating drawing, writing, and photography, can be used to evaluate activities and projects. These techniques are very open-ended and can be useful for capturing and visually displaying different people's opinions and experiences of a project. In general, people enjoy being creative, which makes these methods a really valuable evaluation tool. Creative methods can be used to understand perceptions of certain issues, words, or topics (e.g. science, health, or well-being) or usage (i.e. where people do things, which places they visit). For instance, people could be asked to draw mental maps showing how they perceive a space. I lumped participatory evaluations in here because I was running out of room on the slide, but also because it's similar: as we read about briefly in Nina Simon's book, you involve the perspective of the participants much more closely.
  • Simplified logic model showing how the evaluation study is framed. Developing a program logic model helps ensure alignment between program activities and intended outcomes. Developing a logic model to plan programs, exhibitions, and partnerships allows everyone involved to examine the feasibility of implementation and the potential for program success, gives all collaborators a common language for communicating within and outside the program and organization, and helps others understand the program design. By making assumptions explicit, logic modeling facilitates the development of evaluation questions and communication with stakeholders, and helps situate feedback in a continuous cycle of innovation.

    Inputs or resources: all the possible sources of information, material, staff, or knowledge that may relate to a program or event.

    Outputs: the quantitative measure of the number of things produced, activities performed, or points of contact made with audiences. Outputs can include an exhibition mounted for public display, a publication, or a press release. They can also include the number of people attending a training, total attendance at an exhibition, or a program plan. Outputs represent things, but they cannot establish whether any change occurred as a result of the activity, only that the activity was or was not completed.

    Audiences: once an audience or audiences are clearly identified, a logic model outlines the anticipated changes that members of the target audience may experience. It's important to consider both internal and external audiences. Sometimes changes in internal audiences (museum staff, institution management, or other partner-organization audiences) can have the most direct impact: greater encouragement of projects with risk, better facilitation, possibilities for more stable funding, and so on.

    Outcomes: Short term: social science literature has demonstrated that behavior change is a consequence of changed attitudes, knowledge, and motivations. Long term: generally the logic model outlines a desired long-term goal or social state, such as the development of a recruitment pool for future scientists that is consistent with the gender and ethnic diversity of the nation.
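The inputs, outputs, audiences, and outcomes components described above can be sketched as a small data structure, which makes the model's assumptions explicit and easy to review with collaborators. This is a minimal illustration only; the field names and example values are invented, not taken from the original slides:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal sketch of a program logic model (hypothetical field names)."""
    inputs: list = field(default_factory=list)               # what is invested
    outputs: list = field(default_factory=list)              # what is done/produced
    audiences: dict = field(default_factory=dict)            # internal vs. external
    short_term_outcomes: list = field(default_factory=list)  # changes in knowledge, attitude, ...
    long_term_outcomes: list = field(default_factory=list)   # changes in social/economic conditions

# Example values for a hypothetical traveling-exhibit program.
model = LogicModel(
    inputs=["staff time", "grant funding", "partner museums"],
    outputs=["traveling exhibit", "opening workshop"],
    audiences={"internal": ["museum staff"], "external": ["park visitors"]},
    short_term_outcomes=["increased awareness of astronomy"],
    long_term_outcomes=["broader recruitment pool for future scientists"],
)
print(model.short_term_outcomes)
```

Laying the fields out side by side this way makes it easy to check that every intended outcome traces back to a concrete output and a clearly identified audience.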
  • Logic model for Chandra: it can situate the efforts of the public education programs within a larger framework, and assist in the inventory and assessment of the public impacts derived from the public information and education staff's efforts. The report provides a visual reference tool and an explanation of specific components of the logic model to help situate programs and activities within the overall framework, and offers two idealized program logic models based on existing efforts and possible social outcomes that may derive from education programs seeking to enhance the public value of operating the observatory. Uses: plan programs; examine the feasibility of implementation and potential program success; create tools to communicate within and outside the organization; understand program design and facilitate evaluation questions in general.
  • Overview of FETTU: an exhibition created through distributed curation with an open-source approach and a grassroots network that brought astronomy images and their content to a wide audience in non-traditional venues such as public parks and gardens, art museums, shopping malls, and metro stations. The majority of FETTU events occurred in libraries, hospitals, nature centers, and even a prison. We evaluated a dozen US locations and found that: surveys and interviews showed that exposure to scientific content in these settings leads to inspiration and small learning gains; observations indicated that people were willing to spend a fairly long time engaging with a non-interactive exhibit; and surveys and interviews showed that hosting the exhibit helped create or strengthen the organizer's place in the community and build their capacity for working with their community. Through visitor comments and those from the site organizers, many viewers apparently felt a very personal connection with the images. But we ended up with more questions than answers. For example, a big one: who were we attracting with these types of displays? Were we getting more incidental visitors? Were we attracting the less science-savvy? Did any participants follow up with their local science center or library to find out more? What percentage of tourists did we reach versus actual local community members? I see that as a positive, though: now I have a strong baseline of data to work with, and interesting questions to pursue for the next project, launching next week.
  • Corrected the omissions and tried to answer the questions from FETTU with the newly funded From Earth to the Solar System (FETTSS). Observation sheet, simplified.
  • Corrected the omissions and tried to answer the questions from FETTU with the newly funded From Earth to the Solar System (FETTSS). Survey sheet: better and more nuanced data.
  • Evaluating: no excuse not to! Particularly when dealing with public funds, there is a responsibility and a need for accountability. It is hard to call a project inclusive, participatory, community-based, or social if you don't try to find out what the possible outcomes and impacts were. What can you do when you're short of time and/or funds, which is often the case? A postcard study! Use a 5- or 7-point Likert scale (when responding to a Likert questionnaire item, respondents specify their level of agreement with a statement) to ask some directed questions; then leave room at the end for an open-ended question or two (where the good stuff usually comes in). I like to aim for 50 data points per location to have enough data to work with statistically. Digitize the data in Excel, compute some averages, and see what you get.
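The postcard workflow above (directed Likert-scale questions, roughly 50 responses per location, averages in a spreadsheet) reduces to a few lines of Python if you prefer it to Excel. The question names and responses below are invented purely for illustration:

```python
from statistics import mean

# Hypothetical postcard responses: each card holds 5-point Likert answers
# (1 = strongly disagree ... 5 = strongly agree) to three directed questions.
postcards = [
    {"enjoyed": 5, "learned": 4, "would_return": 5},
    {"enjoyed": 4, "learned": 3, "would_return": 4},
    {"enjoyed": 5, "learned": 5, "would_return": 3},
]

# Average each question across all cards, as one would with spreadsheet columns.
averages = {
    question: round(mean(card[question] for card in postcards), 2)
    for question in postcards[0]
}
print(averages)  # {'enjoyed': 4.67, 'learned': 4.0, 'would_return': 4.0}
```

With a real study you would load the digitized cards from a CSV file and keep the open-ended answers in a separate column for qualitative coding.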
  • The focus group study let us explore in depth just how people go about processing the images they see, and look carefully at expert/novice differences in this processing. About one hour of discussion with 6 volunteers (free food and goodies). Research question 1: How much do variations in the presentation of color, explanatory text, and illustrative scales affect comprehension of, aesthetic attractiveness of, and time spent looking at deep-space imagery? Research question 2: How do novices differ from experts in how they look at astronomical images?
  • Van Gogh
  • Turner
  • Capital ideas: What questions do you want to ask? What are the outputs? Who are your audiences? What outcomes do you want to measure (short term, long term)? What outcomes are useful to your funding organization(s)? If you care about numbers, count participants and impacted visitors. If you want to explore engagement, measure dwell time and ask open-ended questions about participant experiences. If you need mission-specific information, measure indicators that reflect the core values of your institution or funding organization. But to evaluate the impact of a project more effectively and more completely, you need to look at how it affects participants, the broader audience, community partners, and staff.
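If you do choose dwell time as an engagement measure, it reduces to simple arithmetic on the entry/exit timestamps recorded on an observation sheet. A minimal sketch, with hypothetical timestamps standing in for real observations:

```python
from datetime import datetime

# Hypothetical observation log: (entry, exit) timestamps per tracked visitor.
observations = [
    ("2011-06-01 10:02", "2011-06-01 10:09"),
    ("2011-06-01 10:15", "2011-06-01 10:18"),
    ("2011-06-01 10:20", "2011-06-01 10:31"),
]

FMT = "%Y-%m-%d %H:%M"
dwell_minutes = [
    (datetime.strptime(exit_, FMT) - datetime.strptime(entry, FMT)).seconds / 60
    for entry, exit_ in observations
]
mean_dwell = sum(dwell_minutes) / len(dwell_minutes)
print(f"visitors: {len(dwell_minutes)}, mean dwell: {mean_dwell:.1f} min")
# visitors: 3, mean dwell: 7.0 min
```

Reporting both the count and the mean (and, for a real dataset, the spread) covers the "numbers" and "engagement" questions above with the same observation sheet.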
  • Transcript of "Evaluating (Informal Science Ed./Outreach)"

    1. EVALUATION METHODOLOGIES<br />Intro<br />Participant Perspective<br />Institution or Program Perspective<br />TRAILER<br />LOGIC MODEL<br />Sketching out how the program will work and devising a plan to measure the outcomes (integrative framework for analysis)<br />FOCUS GROUP,<br />WORKSHOPS<br />Organized discussion with a small group to understand the hows & whys. Requires mediation, gives social response and perspective.<br />INTERVIEWS, QUESTIONNAIRES & OBSERVATIONS<br />Asking what visitors think or watching how they interact: building your baseline of qualitative and quantitative data.<br />CREATIVE OR PARTICIPATORY<br />Capturing open, qualitative info by participants drawing, photographing or writing responses. Involving participants in creating the eval. framework<br />Kim Arcand<br />
    2. BASIC LOGIC MODEL<br />Intro<br />TRAILER<br />AUDIENCES<br />INPUTS<br />OUTPUTS<br />OUTCOMES<br />Who<br /><ul><li>Internal
    3. External</li></ul>What is done<br /><ul><li>Exhibits
    4. Workshops
    5. Publications
    6. Etc.</li></ul>Short-term<br />Change in:<br /><ul><li>Knowledge
    7. Skills
    8. Attitude
    9. Awareness
    10. Motivation</li></ul>Long-term<br />Change in:<br /><ul><li>Environment
    11. Social conditions
    12. Economic cond.
    13. Political cond.
    14. Etc.</li></ul>What is invested<br /><ul><li>Time
    15. Money
    16. Partners
    17. Equipment
    18. Facilities</li></ul>Evaluation Study: Measurement of process indicators – Measurement of outcome indicators<br />(Formative, summative and anything in between)<br />Adapted from University of Idaho: http://www.uiweb.uidaho.edu/extension/LogicModel.pdf<br />Kim Arcand<br />
    19. SPECIFIC LOGIC MODEL<br />Intro<br />TRAILER<br /><ul><li>Plan programs
    20. Examine feasibility of implementation & potential success
    21. Create tools to communicate within & outside
    22. Facilitate evaluation questions</li></ul>Kim Arcand<br />
    23. FINDING THE RIGHT QUESTIONS…AFTERWARDS?<br />From Earth to the Universe evaluation strategy <br />Intro<br />TRAILER<br /><ul><li> Based on NSF ISE evaluation rec., "Framework for Evaluating Impacts of Informal Science Education Projects" (2008)
    24. Used observation, survey and interview protocols for viewing audiences, as well as staff of host-partner institutions and local community partners
    25. Found out good stuff…but had more questions than answers</li></ul>Kim Arcand<br />
    26. 2ND CHANCES: FETTSS EVALUATION STRATEGY<br />TRAILER<br />Intro<br />Observation sheet:<br />Kim Arcand<br />
    27. 2ND CHANCES: FETTSS EVALUATION STRATEGY<br />TRAILER<br />Intro<br />Survey sheet: (exploring casual vs. intentional; science “identity” & expertise level, demographics, etc.)<br />Kim Arcand<br />
    28. EVALUATING ON A BUDGET: POSTCARDS<br />Intro<br />TRAILER<br />Notes:<br />Errors (sample, selection, etc)<br />Over-generalizing, Subjectivity<br />Evaluation does not = Research<br />Zip code__________<br />Kim Arcand<br />
    29. FOCUS GROUPS<br />Aesthetics & Astronomy studies the perception of multi-wavelength astronomical imagery and the effects of the scientific and artistic choices in processing astronomical data.<br />Ran 3 focus groups in December 2010 for the qualitative data to accompany an online survey and in-person interviews. <br />Intro<br />TRAILER<br />Kim Arcand<br />
    30.
    31. G292.0+1.8<br />20,000 light-years<br />Where do the calcium in our bones and the oxygen in our lungs come from? From supernovae, the exploding stars that create beautiful “supernova remnants” such as this one. This X-ray image of the supernova remnant G292.0+1.8 shows a rapidly expanding shell of gas containing elements such as oxygen, neon, magnesium, silicon and calcium that were created both during the lifetime of the star and in the explosion itself. Explosions like this dispersed elements that were necessary to form our Sun and Solar System.<br />G292.0+1.8<br />20,000 light-years<br />The oxygen we breathe, the iron in our blood, and the calcium in our bones come in part from exploded stars such as the Cassiopeia A supernova remnant seen here. Most of the elements throughout the Universe, other than hydrogen and helium, were forged in the cores of stars. These elements were then dispersed into space when the stars exploded, later to be used as the building blocks when new stars and planets formed. Our Sun and Solar System—including Earth—contain the materials for life thanks to such previous generations of stars. <br />G292.0+1.8<br />20,000 light-years<br />G292.0+1.8 is a young supernova remnant located in our galaxy. This deep Chandra image shows a spectacularly detailed, rapidly expanding shell of gas that is 36 light years across and contains large amounts of oxygen, neon, magnesium, silicon and sulfur. Astronomers believe that this supernova remnant, one of only three in the Milky Way known to be rich in oxygen, was formed by the collapse and explosion of a massive star. Supernovas are of great interest because they are a primary source of the heavy elements believed to be necessary to form planets and life. <br />
    32.
    33.
    34. EVALUATING: CAPITAL IDEAS<br />Intro<br />TRAILER<br />What questions do you want to ask?<br />What are the outputs?<br />Who are your audiences?<br />What outcomes do you want to measure (short term, long term)?<br />What outcomes are useful to your funding organization(s)?<br />AUDIENCES<br />INPUTS<br />OUTPUTS<br />OUTCOMES<br />Kim Arcand<br />
