Technology for Learning Consortium presents its Evaluation Impact Workshop at the AGU Fall Meeting, in conjunction with the NASA Astrobiology E/PO Lead
1. Capacity Building in Evaluation
By
Hilarie Davis, Ed.D., TLC Inc.
Daniella Scalice, NASA Astrobiology E/PO Lead
Bradford T. Davey, Ed.D., TLC Inc.
2. Key Questions
1. What is impact?
2. How well are you measuring impact?
3. How can you improve the quality of evaluation throughout a project's life cycle?
4. How do you collect rigorous evidence to support your conclusions?
3. Outcomes
In this four-hour workshop, you will use the Impact Analysis Method (IAM) to get a diagnostic score that both baselines your current evaluation efforts and helps you plan to increase their rigor.
Learn about the Impact Analysis Method (IAM)
Use the Impact Analysis Method to diagnose the current rigor of your evaluation efforts
Use the Impact Analysis Method to plan ways to improve the rigor of your evaluation efforts
4. Agenda
1. Background on the impetus for this work (5 min)
2. Pre-Performance Task (15 min)
3. Claims and Evidence (20 min)
4. Impact Analysis Method – A Model Consultation (30 min)
5. Practice through each phase (90 min)
6. Practice applying IAM to another project (30 min)
7. Discussion of IAM and Q/A (15 min)
8. Post-Performance Task & Evaluation Survey (15 min)
5. Background
We needed some answers:
What does impact mean?
Do “one-size-fits-all” surveys really measure our impact?
Do numbers of people/products/events measure our real impact?
How can we measure impact richly and rigorously?
How can we collect evidence of impact within our time and budget constraints?
6. Pre-test
Geoff Mulgan - Studio Schools
https://www.ted.com/talks/geoff_mulgan_a_short_intro_to_the_studio_school?language=en
What needs does the model meet?
What are their objectives?
What is the design?
How was it implemented?
What evidence do they have for the outcomes?
7. Evaluation is about claims and evidence
Exercise
If I say I know how to sail, what would you accept as evidence for my claim that I can sail?
8. Now your turn…
Think of something you know how to do well (claim) and identify three sources of evidence of your skill or knowledge.
11. The Impact Analysis Method (IAM)
A Professional Consultation
1. Describe the project
2. Describe and rate the quality of the work in each phase of the project's life cycle using the Project Life Cycle Quality Rubric
3. Identify ideas to improve the project's quality and impact
12. How to Apply the Method: An Example of a Consultation
14. Evaluation-Based Practices to Increase and Measure Impact
Needs Assessment: Determine the context for impact
Objectives: Define specific impacts
Design: Create a plan to achieve impact
Implementation: Deliver the design to achieve impact
Outcomes Assessment: Measure the impact
(A minimal scoring sketch for rating these phases follows.)
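The slides describe the diagnostic score only as rubric ratings over these five phases, so the following is a minimal sketch of how such a score could be tallied, assuming one 1-4 rating per phase; the phase names come from the slide above, while the function and the total/mean summary are hypothetical illustration, not part of the published method.

```python
# Minimal sketch of an IAM-style diagnostic score. Each life cycle phase
# gets one rubric rating: 1 = Fair, 2 = Good, 3 = Very Good, 4 = Excellent.
# The phase names come from the slide above; the summary logic (total and
# mean) is a hypothetical illustration, not part of the published method.

PHASES = [
    "Needs Assessment",
    "Objectives",
    "Design",
    "Implementation",
    "Outcomes Assessment",
]

RATING_LABELS = {1: "Fair", 2: "Good", 3: "Very Good", 4: "Excellent"}


def diagnostic_score(ratings: dict) -> dict:
    """Summarize rubric ratings into a per-phase profile, total, and mean."""
    missing = [p for p in PHASES if p not in ratings]
    if missing:
        raise ValueError(f"No rating for: {missing}")
    bad = {p: r for p, r in ratings.items() if r not in RATING_LABELS}
    if bad:
        raise ValueError(f"Ratings must be 1-4, got: {bad}")
    total = sum(ratings[p] for p in PHASES)
    return {
        "profile": {p: f"{ratings[p]} ({RATING_LABELS[ratings[p]]})" for p in PHASES},
        "total": total,  # ranges from 5 (all Fair) to 20 (all Excellent)
        "mean": round(total / len(PHASES), 2),
    }


# Hypothetical ratings; only the Needs Assessment rating of 1 comes from
# the visiting-scientist example on slide 18.
print(diagnostic_score({
    "Needs Assessment": 1,
    "Objectives": 2,
    "Design": 2,
    "Implementation": 2,
    "Outcomes Assessment": 1,
}))
```

Keeping the per-phase profile alongside the total makes the weakest phases, the ones worth improving first, easy to spot.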
17. Needs Assessment
What is the evidence of need? For whom? Under what circumstances?
Fair (1): Prior experience; "seems like a good idea"
Good (2): Research on what works; literature review on similar programs/products/populations/goals
Very Good (3): Conversation with and/or direction from stakeholders (focus group); experts review the ideas/plan
Excellent (4): Survey of or pilot with potential audience/users about the draft program
18. Visiting scientist program: Teachers and scientists meet at a workshop and explore connections to plan visits
Current NEEDS ASSESSMENT: Based on what worked in prior years to get them to know each other and then contact each other to set up a visit (1)
Problem: Only anecdotal data on how many teachers and scientists actually plan and carry out a visit. They don't leave with a plan.
Better needs assessment: What will help you schedule? Conversations with teachers and scientists reveal they both have limited windows of opportunity for engaging with each other. (3)
Better design: Sharing their available windows of opportunity at the workshop means they can commit to each other and a period of time. Teachers need to come with their calendar of curriculum topics. Scientists need to come with their available dates/periods.
19. Needs Assessment – Rating Exercise
1. Think about one of your programs
2. Use the rubric in your handout to tell the story of your needs assessment based on what you already know
3. Pair up
4. Think together about what could be done to improve the needs assessment for your program, using the rubric as your guide
21. Objectives - Pre-Task
Program (yellow) or Impact (green)?
Explicitly teach Nature of Science principles to engage students in STEM
Students use PBL to identify and address key climate issues
Introduce research faculty in STEM disciplines to advanced theories, methodologies, and techniques in education research in general and STEM education in particular
22. Objectives – Program or Impact?
Increase the number of students who take STEM courses
Increase the number of students who major in STEM
Offer the Nature of Science seminar to as many students as enroll
Advertise the Nature of Science seminar to incoming freshmen
23. Objectives
How clear and measurable are the objectives?
Fair (1): General direction; understood by team; agenda substituting for objectives
Good (2): Explicit, written; for a target audience
Very Good (3): Objectives are SMART: Specific, Measurable, Action-oriented, Realistic, Time-bound
Excellent (4): Logic model of inputs, outputs, and outcomes in place
28. Objectives: What can you impact?
Behavior
Attitude, aspirations
Skills
Interest, engagement
Knowledge
29. Field trip to satellite facility: High school students in a STEM Academy visit a satellite facility
Current objectives: (1)
Exposure to STEM careers and information about satellites
Problem: Only anecdotal data on how much the students learned and benefitted from the visit
Better objectives: SMART (3)
Students will become more aware of the careers in satellite technology, as shown by studying the bios of the people they will meet and developing questions to ask them
Students will be more interested in careers like these, by their own report of how interested they were before and after their experience
Students will become more aware of what satellites do, how they are designed and used, and what data they collect, and will create a slide show for other students on what they learned on the field trip
30. Sample Objective Components
What you expect: Engagement
With whom: Science center public
When: May-Sept
What you will do: Science on a Sphere
How to measure: Behavior (stop and look, take flyer, tell friends, pre/post)
31. Sample Objective Components
What you expect: Knowledge and use increase
With whom: Middle and high school teachers
When: July 8-10, Oct 5, March 23
What you will do: Lunar PD
How to measure: Curriculum integration plans submitted after use
32. Sample Objective Components
What you expect: Knowledge, interest, and career aspirations increase
With whom: Middle school students
When: July 15-19
What you will do: Astro Girls camp
How to measure: Before/after changes, follow-up academic and career choices
(A hypothetical record-style sketch of these components follows.)
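Slides 30-32 repeat the same five components for every objective, so each objective is effectively one record. As a purely hypothetical illustration (the field labels come from the slides; the dataclass itself is not part of the method), the Astro Girls example could be captured like this:

```python
from dataclasses import dataclass

# One record per objective, mirroring the five components on slides 30-32.
# The field labels come from the slides; the dataclass itself is a
# hypothetical way to keep the components together, not part of the method.
@dataclass
class Objective:
    what_you_expect: str   # impact category and direction of change
    with_whom: str         # target audience
    when: str              # time frame that makes the objective time-bound
    what_you_will_do: str  # the program activity
    how_to_measure: str    # the evidence that will show the impact

# The Astro Girls example from slide 32.
astro_girls = Objective(
    what_you_expect="Knowledge, interest, and career aspirations increase",
    with_whom="Middle school students",
    when="July 15-19",
    what_you_will_do="Astro Girls camp",
    how_to_measure="Before/after changes, follow-up academic and career choices",
)
print(astro_girls)
```

An objective with all five fields filled in is, by construction, close to SMART: the time frame makes it time-bound and the measure makes it measurable.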
33. Objectives Practice
1. Think about one of your programs
2. Use the rubric in your handout to improve your objectives
3. Pair up
4. Think together about what could be done to improve your objectives
34. Objectives - Post-Task
Program (yellow) or Impact (green)?
Students are more knowledgeable in programming, robotics & research
Immerse students in a real science endeavor
Students have stronger identity as a STEM thinker & doer
Offer minds-on robotics competitions to students, leveraging the star power of the ISS
State STEM partnerships are enhanced
35. Objectives – Program or Impact?
Students are more knowledgeable in programming, robotics & research
Students are more interested in STEM topics and careers
Create a bridge between industry, scientists, professionals, celebrities, and students
Facilitators are more knowledgeable in programming, robotics & research
Offer web-based curriculum and training, access to the ISS, and ongoing technical support
37. Design
How evidence/research-based is the design?
Fair (1): Series of activities; uses what has worked before
Good (2): Based on objectives; connects to standards; includes contingency plans for emerging needs
Very Good (3): Thematic; has continuity; participatory, personalized, responsive; uses advance organizers
Excellent (4): Developmental; embeds evaluation/reflection
38. Mission Planning: Students learn about space exploration in online activities, followed by an onsite simulation planning a mission to Mars
Current design (1)
Series of online lessons on space exploration
Problem: Online lessons not based directly on objectives of the simulated mission. Design is not responsive to individual needs and does not ask students to reflect on what they are learning.
Better design (4)
Online lessons have objectives closely aligned to the simulated mission, more variety in format, more choices for students on how they learn, and more opportunity for individual reflection on planning a mission, perhaps in different roles as they progress in their understanding.
39. Design Practice
1. Rate the design you are given on the rubric
2. Using the rubric and your own background knowledge, come up with at least one way to improve the design
3. Find another person with the same design (by number)
4. Compare notes on the rating and how to improve the design
5. Share out
41. Implementation
How effective is the implementation? (fidelity to design)
Fair (1): Post-only survey or reflection; follow-up survey or interview; web stats; anecdotes; facilitator reports
Good (2): External evaluator observes or does case studies; pre/post self-report survey, reflections; post-only measure (test, retrospective survey, task)
Very Good (3): Pre/post measures (tests, performance tasks, observation); pre/post follow-up
Excellent (4): Comparison group studies (quasi-experimental); experimental studies (random assignment)
42. Student Out-of-School Club: Students do modules from the Space Science Sequence
Current implementation (1)
Teachers prepared in a workshop to give a pre-test, use the curriculum, and give the post-test
Problem: Teachers were not always comfortable implementing the curriculum
Better implementation (4)
Since teachers are the "facilitators" in this experience, they needed coaching, modeling, and reflection with a coach to improve their implementation
43. Implementation Practice
1. Think about one of your programs
2. Use the rubric in your handout to rate your implementation
3. Pair up
4. Think together about what could be done to improve the fidelity of your program's implementation, using the rubric as your guide
45. Outcomes Assessment/Methods
How rigorous are the impact methods?
Fair (1): Post-only survey or reflection; follow-up survey or interview; web stats; anecdotes; facilitator reports
Good (2): External evaluator observes or does case studies; pre/post self-report survey, reflections; post-only measure (test, retrospective survey, task)
Very Good (3): Pre/post measures (tests, performance tasks, observation); pre/post follow-up
Excellent (4): Comparison group studies (quasi-experimental); experimental studies (random assignment)
46. Field work with scientists: High school students work throughout the year with scientists doing research in the field, including site visits, lab work, and virtual lectures
Current assessment of impact (1)
Students do presentations at a "family night" at the end of the school year
Limitations: Student presentations are not evaluated; pre/post knowledge gains are not assessed; the program is having a large effect on student career choices that is documented only through anecdotes
Better assessment of outcomes (4)
Pre/post knowledge test; reflection by students after each activity (survey); analysis of student presentations for accuracy and insight; retrospective survey on impact on career interests; alumni survey
(A minimal sketch of a pre/post gain computation follows.)
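The "better assessment" above starts with a pre/post knowledge test. As a minimal sketch with invented scores (the numbers and the 100-point scale are assumptions for illustration), the gains such a test supports could be summarized like this, including the normalized gain, one common way to report pre/post results:

```python
from statistics import mean, stdev

# Hypothetical paired pre/post knowledge-test scores (percent correct),
# one pair per student; the numbers are invented for illustration.
pre = [40, 55, 35, 60, 50, 45]
post = [70, 80, 55, 85, 75, 65]

# Raw gain per student, and the normalized gain: actual gain divided by
# the maximum gain still possible given the student's pre-test score.
gains = [b - a for a, b in zip(pre, post)]
norm_gains = [(b - a) / (100 - a) for a, b in zip(pre, post) if a < 100]

print(f"mean gain: {mean(gains):.1f} points (sd {stdev(gains):.1f})")
print(f"mean normalized gain: {mean(norm_gains):.2f}")
```

Normalizing by the possible gain keeps students who started near the ceiling from masking real growth in the rest of the group.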
47. What claims does the program want to be able to make?
• Claims are what the managers make to secure funding, meet priorities, and build a portfolio of projects that make the best use of resources
• Claims are made about how, why, and under what conditions the program is delivered
• Claims are made about the impact of the program
48. Practice with Claims
1. If you get a claim, describe a rigorous design that would provide evidence for that claim
2. If you get a design, describe what claim you would be able to make
3. Find the person (match numbers) with the same claim/design, compare your responses, and come up with your most rigorous design for each claim
51. Apply the IAM to Another Project
1. Think about one of your programs
2. Pair up
3. Use the rubric in your handout to rate the quality of work in each phase of that program's life cycle
4. For each phase, identify changes that could be made to improve the quality/rating
5. Share out after each phase
52. Discussion
What do you think about this way of assessing impact? Benefits? Drawbacks?
Has your perspective on your project changed? How?
What will you do differently in the future?
53. Post-Task: Identify & Rate
Literacy in Rio classrooms
http://tedxtalks.ted.com/video/Reinventing-Education-in-Rio-De;search%3Aeducation%20innovation
What needs does the model meet?
What are their objectives?
What is the design?
How was it implemented?
What evidence do they have for the outcomes?
55. Through evaluation we are able to collect evidence and develop explanatory models of how to bring back the wonder for teachers and students to know, care about, and pursue STEM learning
56. For further reading
Planting the Seeds for a Diverse US STEM Pipeline: A Compendium of Best Practice K-12 STEM Education Programs. http://www.bayerus.com/msms/web_docs/Compendium.pdf
Freeman et al. Active learning increases student performance in science, engineering, and mathematics. http://www.pnas.org/content/early/2014/05/08/1319030111
57. Videos for more practice
Going from passive to active users: 4th graders being designers (6:58, 7:51 design projects). http://tedxtalks.ted.com/video/TEDxCreativeCoast-McGrath-Davie/player
Australian school using neuroscience (1:32, 2:28). http://www.abc.net.au/news/2015-08-31/neuro-science-turning-around-school-results/6738984
Kenyan Technology (6:50, 8:30, 14:00 key findings/eval): We are born to creativity, but are taught out of it; taught to cram, not understand. http://tedxtalks.ted.com/video/E-learning-in-Kenya-Anne-Salim;search:education%20research
58. Evaluation
Please complete the short paper survey to give us your
individual feedback.
Thank you!
hilarie@techforlearning.org
daniella.m.scalice@nasa.gov
brad@techforlearning.org
Editor's Notes
----- Meeting Notes (12/12/15 15:06) -----
Types of Evidence
Evidence from authority
Documentation
Performance review
Expert review
Sometimes you know what they need. Sometimes you don’t. It’s best to ask.
Make sure your objectives are clear!
----- Meeting Notes (12/12/15 12:26) -----
How do we make objectives that are measurable?
First they have to be... specific to the impact category
Measurable - have a way to be assessed
Action-Oriented - what the student/participant will do to achieve the impact
Realistic - for the audience and time frame
Timely - be time-bound; they will do it in a certain time, after a certain number of experiences, etc.
----- Meeting Notes (12/12/15 12:26) -----
NSF Identified these categories of impacts
We have used them on numerous projects in the Science Mission Directorate
It allows for roll-up across projects - here is how we affected skills, knowledge, etc.
Design is important <pause>
Design influences everything
Design connects Objectives to Implementation