Taking the Mystery out of Project Evaluation
Marjorie Dennis, UMass Lowell
Lisa Glickstein, PhD
Andover Public Schools
Introductions
• Marjorie Dennis: NSF GK-12, RET; US ED TAH, PT3; State ITQ, STEM Pipeline
• Lisa Glickstein: US ED TAH, PEP; State STEM Pipeline
• Participant Experiences
Googlisms!
• Evaluation is…
• …“used to manage a project to meet objectives.”
• …“the process of measuring the quality of performance against standards to determine if the standards have been met.”
• …“a systematic inquiry designed to provide information to decision makers.”
Session Overview
• Why do you evaluate?
• Who chooses the evaluator?
• Where do you find an evaluator?
• When do you start working with the evaluator?
• How do you design and carry out an evaluation?
• What do you do if something goes wrong?
WHY?
• Because you have to – federal or state law
• To get feedback during the project
  – Formative evaluation
• Because you want to answer the big question: DID IT WORK?
  – Summative evaluation
“Research seeks to prove, evaluation seeks to improve…”
M.Q. Patton
(former President of the American Evaluation Association)
Adapted from: Maberry Consulting Group
WHO?
• Who is a good evaluator?
• Who selects the evaluator?
• Who decides contract terms?
• Who will the evaluator report to?
• Who will the evaluator work with?
• Who owns the evaluation plan, data and reports?
Internal versus External Evaluation
• Program guidelines
• Cost
• Availability
• Existing relationships
Four Steps to Select an Evaluator
1. Decide who selects the evaluator
   • Project Director or Advisory Board
   • Head of organization
   • Grant or Business Office (lowest bidder)
   • Grant-maker
2. Contact potential evaluators and solicit proposals
Four Steps (cont’d)
3. Choose selection criteria (see Notes)
   • Knowledge
   • Experience – overall, and project-specific
   • Cost
   • References
   • Communication skills
4. Evaluate proposals and select
Intellectual PropertyIntellectual Property
 The funder paid for the evaluation and canThe funder paid for the evaluation and can
request data or reports at any timerequest data or reports at any time
 The fiscal agent/sponsor retains the hardThe fiscal agent/sponsor retains the hard
and electronic copies of all evaluationand electronic copies of all evaluation
materialsmaterials
 Evaluation plan, rubrics, data and reportsEvaluation plan, rubrics, data and reports
all belong to the project (not to theall belong to the project (not to the
evaluator)evaluator)
WHERE?
• Where do you find an evaluator?
  – In house
  – External
• Where does one learn to be an evaluator?
• Where are evaluators certified?
• Where do external evaluators work?
Professional Evaluators
• American Evaluation Association (http://www.eval.org/)
• Graduate & Certificate Programs (http://www.eval.org/p/cm/ld/fid=43)
• There is no single recognized credential
• Consulting or in a university setting (http://www.eval.org/p/cm/ld/fid=108)
WHEN?
• When do you select the evaluator?
• When do you work out terms of hire?
• When do you have the first project meeting?
• When and how often will you meet?
• When do you expect reports?
Initial ContactInitial Contact
 Involving the evaluator in project planning, orInvolving the evaluator in project planning, or
planning evaluation during (not after) projectplanning evaluation during (not after) project
design is a good ideadesign is a good idea
 An evaluator can prepare an evaluation plan forAn evaluator can prepare an evaluation plan for
a proposal (usually at no cost)a proposal (usually at no cost)
 Sometimes it wins points to name an evaluatorSometimes it wins points to name an evaluator
in the grant proposalin the grant proposal
 Be aware of bid laws in your stateBe aware of bid laws in your state
 Post-award some changes to the evaluation planPost-award some changes to the evaluation plan
are okayare okay
Contract and Terms
• Scope of services – list of activities and work products (see Notes)
• Cost of entire contract, or hourly rate and estimated total (8-15% of total grant is to be expected)
• Automatic termination if grant ends; options to terminate for cause or no cause
• Protections for the organization from failures or omissions on the part of the evaluator
HOW?
• How do you design and carry out an evaluation?
  – Create evaluation plan
  – Select indicators and benchmarks
• How do you collect and analyze data?
  – Analyze data
  – Write reports
Evaluation Plan
• Project’s goals, activities and timeline include data collection
• Goal = ultimate change the project intends to make in a condition (e.g., poverty)
• Indicators = changes in attitudes, knowledge and behaviors that show progress towards meeting the goal
• Benchmarks = amount of change expected in a given project year
Mini-Evaluation
• Working alone or in groups, fill out the sample evaluation form for your program
• Focus first on what you are certain of – include details where known
• Focus next on what you don’t know – how would you measure or find out? Who would you ask, and how?
• Can you make recommendations?
Research vs Evaluation
Research:
• Production of generalizable knowledge is the goal
• Researcher-derived questions (hypothesis)
• Paradigm stance (how or why does it work?)
• Setting designed to test hypothesis (controlled)
• Clearer role (principal investigator)
• Published
• Clearer allegiance
Evaluation:
• May produce generalizable knowledge
• Program- or funder-derived questions
• Judgmental quality (does it work?)
• Action setting to provide direct services
• Role conflicts (project director AND evaluator)
• Often unpublished
• Multiple allegiances
Adapted from: Maberry Consulting Group
Formative Evaluation
• Project proceeding as planned?
• Activities, recruitment, data collection taking place?
• All partners working together?
• How can or should the project get back on track?
• Is the project getting closer to its anticipated goals?
Summative Evaluation
• Did the project meet its goals or not?
• Did the participants benefit?
• To what extent did they benefit?
• Can you prove that activities caused the benefits?
• What activities were most effective or cost-effective?
• Is the project sustainable?
• Is the project replicable?
Short vs Long-term Outcomes
OUTCOMES: What results for individuals, families, communities…

SHORT – Learning. Changes in:
• Awareness
• Knowledge
• Attitudes
• Skills
• Opinion
• Aspirations
• Motivation
• Behavioral intent

MEDIUM – Action. Changes in:
• Behavior
• Decision-making
• Policies
• Social action

LONG-TERM – Conditions. Changes in:
• Well-being
• Health
• Economy
• Civics
• Environment
Adapted from: University of Wisconsin – Logic Models
The Four Standards
• Utility: Who needs the information and what information do they need?
• Feasibility: How much money, time, and effort can we put into this?
• Propriety: What steps need to be taken for the evaluation to be ethical?
• Accuracy: What design will lead to accurate information?
Adapted from: Maberry Consulting Group
Six methods to collect information and data…

Method: 1. Questionnaires, Surveys, and Checklists
Overall Purpose: To quickly and/or easily get a lot of information from people in a non-threatening way
Advantages:
• Can be completed anonymously
• Inexpensive to administer
• Easy to compare and analyze
• Administer to many people
• Can get lots of data
• Many sample questionnaires
Challenges:
• Might not get careful feedback
• Wording can bias responses
• Are impersonal
• In surveys, may need sampling expert
• Doesn't get the full story
Adapted from: Maberry Consulting Group
Six methods (continued)…

Method: 2. Interviews
Overall Purpose: To fully understand participant's impressions or experiences, or to learn more about their answers to questionnaires
Advantages:
• Collects a full range and depth of information
• Develops relationship with the participant
• Can be flexible and responsive
Challenges:
• May take too long
• Can be hard to analyze and compare
• Can be costly
• Interviewer can bias responses
Adapted from: Maberry Consulting Group
Six methods (continued)…

Method: 3. Documentation review
Overall Purpose: To get an impression of how the program operates without interrupting it; review of applications, finances, memos, minutes, participant work products, and so on
Advantages:
• Collects comprehensive and historical information
• Doesn't interrupt program or participant's routine
• Information already exists
• Few biases about information
Challenges:
• Often time-consuming
• Information may be incomplete
• Need to be quite clear about what you are looking for
• Not flexible, as data restricted to what already exists
Adapted from: Maberry Consulting Group
Six methods (continued)…

Method: 4. Observation
Overall Purpose: To gather accurate information about how a program actually operates, particularly about processes
Advantages:
• View operations of a program as they are actually occurring
• Can adapt to events as they occur
Challenges:
• Difficult to interpret observed behaviors
• Complex to categorize observations
• Can influence behaviors of program participants
• Can be expensive
Adapted from: Maberry Consulting Group
Six methods (continued)…

Method: 5. Focus groups
Overall Purpose: To explore a topic in depth through group discussion, such as reactions to an experience or suggested change, or understanding common complaints (useful in evaluation and marketing)
Advantages:
• Quickly and reliably get common impressions
• Can be an efficient way to get a range and depth of information in a short time
• Can convey key information about programs
Challenges:
• Hard to analyze responses
• Need a good facilitator for safety and closure
• Difficult to schedule 6-8 people together
Adapted from: Maberry Consulting Group
Six methods (continued)…

Method: 6. Case studies
Overall Purpose: To fully understand or depict participant experiences in a program, and conduct comprehensive examination through cross comparison of cases
Advantages:
• Fully depicts participant's experience in program input, process and results
• Powerful means to portray program to outsiders
Challenges:
• Usually quite time-consuming to collect, organize and describe
• Represents depth of information, rather than breadth
Adapted from: Maberry Consulting Group
Basic analysis of quantitative (numeric) data
1. Make copies of your data and store the original. Use the copy for making edits, cutting and pasting, and so on.
2. Tabulate the information in one or more spreadsheets.
3. Examine the median, mean, and distribution (range) of data sets; think about which is/are more meaningful.
4. Think about segmentation: Do you have sub-populations based on demographics or how much service was received?
5. Consider confounding variables: Is the effect due to a bias rather than your service or intervention?
Adapted from: Maberry Consulting Group
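The summary statistics in steps 3 and 4 need nothing beyond a spreadsheet, but they can also be scripted. A minimal sketch using Python's standard library is below; the survey scores and the subgroup split are invented for illustration, not data from any actual project.

```python
from statistics import mean, median

# Hypothetical post-survey scores on a 1-5 scale (invented for illustration).
scores = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]

# Step 3: examine the median, mean, and range of the data set.
print(f"mean={mean(scores):.2f}, median={median(scores)}, "
      f"range={min(scores)}-{max(scores)}")

# Step 4: segmentation - compare a sub-population (say, first-year
# participants) against the rest before drawing conclusions.
first_year = [4, 5, 3, 4, 2]
returning = [5, 4, 3, 5, 4]
print(f"first-year mean={mean(first_year):.2f}, "
      f"returning mean={mean(returning):.2f}")
```

Comparing the mean and median (step 3) quickly shows whether a few outliers are skewing the average, which matters with small participant counts.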
Basic analysis of qualitative (subjective) data
(interview or focus group responses, or written commentary on questionnaires)
1. Read through all the data.
2. Organize comments into similar categories: concerns, suggestions, strengths, weaknesses, similar experiences, program inputs, recommendations, outputs, or outcome indicators.
3. Look for patterns, associations and possible causal relationships.
4. Keep all raw data as recommended by the funder after completion, in case needed for future reference.
Adapted from: Maberry Consulting Group
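The categorization in step 2 is normally done by hand, but a keyword first pass can speed up sorting a large pile of comments. The sketch below is only illustrative: the category names follow step 2, while the keyword lists and sample comments are invented, and any real coding scheme would need human review.

```python
# Hypothetical first-pass sorting of free-text comments by keyword.
# Categories follow step 2 above; keywords are invented for illustration.
CATEGORIES = {
    "concerns": ["worried", "concern", "confusing"],
    "suggestions": ["suggest", "recommend", "wish"],
    "strengths": ["great", "helpful", "loved"],
}

def categorize(comment: str) -> list[str]:
    """Return every category whose keywords appear in the comment."""
    text = comment.lower()
    return [cat for cat, words in CATEGORIES.items()
            if any(word in text for word in words)]

comments = [
    "The workshop was great and very helpful.",
    "I suggest more time for questions.",
    "I'm worried the schedule is confusing.",
]
for c in comments:
    print(categorize(c) or ["uncategorized"], "-", c)
```

A comment can land in several categories at once, which mirrors hand-coding practice; anything left uncategorized goes back into the step 1 read-through.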
Evaluation is not only Indicators
• Evaluation: indicators in the context of:
  – Design & implementation
  – Program relevance
  – Efficiency
  – Effectiveness
  – Impact
  – Sustainability
Experiences
• Write down the worst thing that an evaluator ever said to you about your project…
• …or the worst thing that ever went wrong on a grant-funded project at your organization!
WHAT?
• What do we do when something goes wrong?
  – Partners don’t get along
  – Leadership issues
  – Disagreements over data interpretation
  – Non-performance by partner(s)
  – Change in conditions
  – Conflicts of interest
The Challenge of Being an Evaluator (Sirotnik, 1999)
If you…are looking for highly definitive, generalizable, cause/effect relationships between measurable, high-validity, independent and dependent variables, then I suggest you go to work for a pharmaceutical company…
However, if you have a high tolerance for ambiguity and like to mess around…with contexts that are always colliding and changing…in these settings that are engaged in long-term processes of renewal and change – well then, have I got a career for you!
If the evaluator ain’t happy, ain’t no one happy…
• If your project isn’t going well, it is the evaluator’s job to tell you (and ultimately to tell the funder)
• If you have an issue with your evaluator, first look at and fix issues with your project
• Most of the time, this will repair your relationship with the evaluator, or at least make it tolerable!
Conflict of Partners
• Partners have conflicting ideas
• One or more partners are not carrying through with activities or data collection
• Personality clashes
• Complaints about distribution of resources, or other complaints
Leadership Issues
• Poor communication – no contact, bad emails, late information and notices
• Bills not paid on time, delays in processing registrations for activities
• Bad record-keeping or data collection
• Failure to respond to formative evaluation
• Mismanagement of project resources
• Illegal activity
Different Interpretations of Data
• Evaluator sees the data one way, project partners see it another
• Lack of data leads to subjective assessment
• One party rejects data
Non-Performance of Partners
• Failure to complete scope of services
• Communication issues
• Significant, ongoing delays
• Personality clashes or personnel problems
• Undermining professional relationships
• Mismanagement of funds
Change in Conditions
• New competitor (threat)
• Change in law
• Sub-contractor or partner goes out of business or quits
• Significant economic downturn
• Change in leadership (project or organization)
Conflict of Interest
• Is the evaluator independent enough to objectively evaluate the project?
• Would giving a positive or negative evaluation result in financial gain?
  – Future contracts with one or more partners
• Is there an appearance of conflict of interest?
  – Employer/employee or family relationship
Presentation Wrap-Up
• Additional stories and comments on the evaluation process
• Questions on evaluation
• Reference materials for further investigation, and information on working with an evaluator and the evaluation process
Additional Reading
• Evaluating Professional Development, Thomas R. Guskey, ISBN 0-7619-7561-6, 2000.
• “8 Smooth Steps: Steps to Your Own Evaluation,” Joellen Killion, Journal of the National Staff Development Council, p. 14, Fall 2003.
• Practicing Evaluation: A Collaborative Approach, Rita O’Sullivan, ISBN 0-7619-2456-5, 2004.
• W.K. Kellogg Foundation Evaluation Handbook & Development Guide, www.wkkf.org, 2008.
Thank you for attending!
• Contact information:
  – Lisa Glickstein: lglickstein@aps1.net
  – Marjorie Dennis: marjorie_dennis@uml.edu
Taking the Mystery out of Project Evaluation

  • 1.
    Taking the Mysteryout ofTaking the Mystery out of Project EvaluationProject Evaluation Marjorie Dennis, UMass LowellMarjorie Dennis, UMass Lowell Lisa Glickstein PhDLisa Glickstein PhD Andover Public SchoolsAndover Public Schools
  • 2.
    IntroductionsIntroductions  Marjorie Dennis:NSF GK-12, RET; US EDMarjorie Dennis: NSF GK-12, RET; US ED TAH, PT3; State ITQ, STEM PipelineTAH, PT3; State ITQ, STEM Pipeline  Lisa Glickstein: US ED TAH, PEP; StateLisa Glickstein: US ED TAH, PEP; State STEM PipelineSTEM Pipeline  Participant ExperiencesParticipant Experiences
  • 3.
    Googlisms!Googlisms!  Evaluation is…Evaluationis…  …”…”used to manage a project to meetused to manage a project to meet objectives.”objectives.”  …”…”the process of measuring the quality ofthe process of measuring the quality of performance against standards toperformance against standards to determine if the standards have beendetermine if the standards have been met.”met.”  …”…”a systematic inquiry designed toa systematic inquiry designed to provide information to decision makers.”provide information to decision makers.”
  • 4.
    Session OverviewSession Overview WhyWhy do you evaluate?do you evaluate?  WhoWho chooses the evaluator?chooses the evaluator?  WhereWhere do you find an evaluator?do you find an evaluator?  WhenWhen do you start working with thedo you start working with the evaluator?evaluator?  HowHow do you design and carry out ando you design and carry out an evaluation?evaluation?  WhatWhat do you do if something goes wrong?do you do if something goes wrong?
  • 5.
    WHY?WHY?  Because youhave to – federal or state lawBecause you have to – federal or state law  To get feedback during the projectTo get feedback during the project  Formative evaluationFormative evaluation  Because you want to answer the bigBecause you want to answer the big question:question: DID IT WORK?DID IT WORK?  Summative evaluationSummative evaluation
  • 6.
    ““Research seeks toResearchseeks to proveprove,, evaluation seeks toevaluation seeks to improveimprove…”…” M.Q. PattonM.Q. Patton (former President of the American Evaluation Association)(former President of the American Evaluation Association) Adapted from: Maberry Consulting Group
  • 7.
    WHO?WHO?  Who isa good evaluator?Who is a good evaluator?  Who selects the evaluator?Who selects the evaluator?  Who decides contract terms?Who decides contract terms?  Who will the evaluator report to?Who will the evaluator report to?  Who will the evaluator work with?Who will the evaluator work with?  Who owns the evaluation plan, data andWho owns the evaluation plan, data and reports?reports?
  • 8.
    Internal versus ExternalEvaluationInternal versus External Evaluation  Program guidelinesProgram guidelines  CostCost  AvailabilityAvailability  Existing RelationshipsExisting Relationships
  • 9.
    Four Steps toSelect an EvaluatorFour Steps to Select an Evaluator 1.1. Decide who selects the evaluatorDecide who selects the evaluator  Project Director or Advisory BoardProject Director or Advisory Board  Head of organizationHead of organization  Grant or Business Office (lowest bidder)Grant or Business Office (lowest bidder)  Grant-makerGrant-maker 1.1. Contact potential evaluators and solicitContact potential evaluators and solicit proposalsproposals
  • 10.
    Four Steps (cont’d)FourSteps (cont’d) 3.3. Choose selection criteria (see Notes)Choose selection criteria (see Notes)  KnowledgeKnowledge  Experience – overall, and project-specificExperience – overall, and project-specific  CostCost  ReferencesReferences  Communication skillsCommunication skills 3.3. Evaluate proposals and selectEvaluate proposals and select
  • 11.
    Intellectual PropertyIntellectual Property The funder paid for the evaluation and canThe funder paid for the evaluation and can request data or reports at any timerequest data or reports at any time  The fiscal agent/sponsor retains the hardThe fiscal agent/sponsor retains the hard and electronic copies of all evaluationand electronic copies of all evaluation materialsmaterials  Evaluation plan, rubrics, data and reportsEvaluation plan, rubrics, data and reports all belong to the project (not to theall belong to the project (not to the evaluator)evaluator)
  • 12.
    WHEREWHERE??  WhereWhere doyou find an evaluator?do you find an evaluator?  In houseIn house  ExternalExternal  WhereWhere does one learn to be an evaluator?does one learn to be an evaluator?  WhereWhere are evaluators certified?are evaluators certified?  WhereWhere do external evaluators work?do external evaluators work?
  • 13.
    Professional EvaluatorsProfessional Evaluators American Evaluation Association (American Evaluation Association ( http://www.eval.org/http://www.eval.org/))  Graduate & Certificate Programs (Graduate & Certificate Programs ( http://www.eval.org/p/cm/ld/fid=43http://www.eval.org/p/cm/ld/fid=43))  There is no single recognized credentialThere is no single recognized credential  Consulting or in a university setting (Consulting or in a university setting ( http://www.eval.org/p/cm/ld/fid=108http://www.eval.org/p/cm/ld/fid=108))
  • 14.
    WHEN?WHEN?  WhenWhen doyou select the evaluator?do you select the evaluator?  WhenWhen do you work out terms of hire?do you work out terms of hire?  WhenWhen do you have the first projectdo you have the first project meeting?meeting?  WhenWhen and how often will you meet?and how often will you meet?  WhenWhen do you expect reports?do you expect reports?
  • 15.
    Initial ContactInitial Contact Involving the evaluator in project planning, orInvolving the evaluator in project planning, or planning evaluation during (not after) projectplanning evaluation during (not after) project design is a good ideadesign is a good idea  An evaluator can prepare an evaluation plan forAn evaluator can prepare an evaluation plan for a proposal (usually at no cost)a proposal (usually at no cost)  Sometimes it wins points to name an evaluatorSometimes it wins points to name an evaluator in the grant proposalin the grant proposal  Be aware of bid laws in your stateBe aware of bid laws in your state  Post-award some changes to the evaluation planPost-award some changes to the evaluation plan are okayare okay
  • 16.
    Contract and TermsContractand Terms  Scope of services – list of activities andScope of services – list of activities and work products (see Notes)work products (see Notes)  Cost of entire contract, or hourly rate andCost of entire contract, or hourly rate and estimated total (8-15% of total grant is toestimated total (8-15% of total grant is to be expected)be expected)  Automatic termination if grant ends,Automatic termination if grant ends, options to terminate for cause or no causeoptions to terminate for cause or no cause  Protections for organization from failuresProtections for organization from failures or omissions on part of evaluatoror omissions on part of evaluator
  • 17.
    HOW?HOW?  How doyou design and carry out anHow do you design and carry out an evaluation?evaluation?  Create evaluation planCreate evaluation plan  Select indicators and benchmarksSelect indicators and benchmarks  How do you collect and analyze data?How do you collect and analyze data?  Analyze dataAnalyze data  Write reportsWrite reports
  • 18.
    Evaluation PlanEvaluation Plan Project’s goals, activities and timeline includeProject’s goals, activities and timeline include data collectiondata collection  Goal = ultimate change project intends to make inGoal = ultimate change project intends to make in a condition (e.g. poverty)a condition (e.g. poverty)  Indicators = changes in attitudes, knowledge andIndicators = changes in attitudes, knowledge and behaviors that show progress towards meetingbehaviors that show progress towards meeting goalgoal  Benchmarks = amount of change expected in aBenchmarks = amount of change expected in a given project yeargiven project year
  • 19.
Mini-Evaluation
 Working alone or in groups, fill out the sample evaluation form for your program
 Focus first on what you are certain of – include details where known
 Focus next on what you don't know – how would you measure or find out? Who would you ask, and how?
 Can you make recommendations?
  • 20.
Research vs Evaluation
Research:
 Production of generalizable knowledge is the goal
 Researcher-derived questions (hypothesis)
 Paradigm stance (how or why does it work?)
 Setting designed to test the hypothesis (controlled)
 Clearer role (principal investigator)
 Published
 Clearer allegiance
Evaluation:
 May produce generalizable knowledge
 Program- or funder-derived questions
 Judgmental quality (does it work?)
 Action setting to provide direct services
 Role conflicts (project director AND evaluator)
 Often unpublished
 Multiple allegiances
Adapted from: Maberry Consulting Group
  • 21.
Formative Evaluation
 Is the project proceeding as planned?
 Are activities, recruitment, and data collection taking place?
 Are all partners working together?
 How can or should the project get back on track?
 Is the project getting closer to its anticipated goals?
  • 22.
Summative Evaluation
 Did the project meet its goals or not?
 Did the participants benefit?
 To what extent did they benefit?
 Can you prove that the activities caused the benefits?
 What activities were most effective or cost-effective?
 Is the project sustainable?
 Is the project replicable?
  • 23.
Short vs Long-term Outcomes
OUTCOMES: What results for individuals, families, communities…
 SHORT (Learning) – changes in: awareness, knowledge, attitudes, skills, opinion, aspirations, motivation, behavioral intent
 MEDIUM (Action) – changes in: behavior, decision-making, policies, social action
 LONG-TERM (Conditions) – changes in: well-being, health, economy, civics, environment
Adapted from: University of Wisconsin – Logic Models
  • 24.
The Four Standards
 Utility: Who needs the information, and what information do they need?
 Feasibility: How much money, time, and effort can we put into this?
 Propriety: What steps need to be taken for the evaluation to be ethical?
 Accuracy: What design will lead to accurate information?
Adapted from: Maberry Consulting Group
  • 25.
Six methods to collect information and data…
1. Questionnaires, Surveys, and Checklists
 Purpose: To quickly and/or easily get a lot of information from people in a non-threatening way
 Advantages: Can be completed anonymously; inexpensive to administer; easy to compare and analyze; can be administered to many people; can gather lots of data; many sample questionnaires exist
 Challenges: Might not get careful feedback; wording can bias responses; impersonal; in surveys, may need a sampling expert; doesn't get the full story
Adapted from: Maberry Consulting Group
  • 26.
Six methods (continued)…
2. Interviews
 Purpose: To fully understand a participant's impressions or experiences, or to learn more about their answers to questionnaires
 Advantages: Collects a full range and depth of information; develops a relationship with the participant; can be flexible and responsive
 Challenges: May take too long; can be hard to analyze and compare; can be costly; the interviewer can bias responses
Adapted from: Maberry Consulting Group
  • 27.
Six methods (continued)…
3. Documentation review
 Purpose: To get an impression of how the program operates without interrupting it; review of applications, finances, memos, minutes, participant work products, and so on
 Advantages: Collects comprehensive and historical information; doesn't interrupt the program or participants' routine; information already exists; few biases about the information
 Challenges: Often time-consuming; information may be incomplete; need to be quite clear about what you are looking for; not flexible, as data are restricted to what already exists
Adapted from: Maberry Consulting Group
  • 28.
Six methods (continued)…
4. Observation
 Purpose: To gather accurate information about how a program actually operates, particularly about processes
 Advantages: View the operations of a program as they are actually occurring; can adapt to events as they occur
 Challenges: Difficult to interpret observed behaviors; complex to categorize observations; can influence the behavior of program participants; can be expensive
Adapted from: Maberry Consulting Group
  • 29.
Six methods (continued)…
5. Focus groups
 Purpose: To explore a topic in depth through group discussion, such as reactions to an experience or suggested change, or understanding common complaints (useful in evaluation and marketing)
 Advantages: Quickly and reliably gathers common impressions; can be an efficient way to get a range and depth of information in a short time; can convey key information about programs
 Challenges: Hard to analyze responses; needs a good facilitator for safety and closure; difficult to schedule 6-8 people together
Adapted from: Maberry Consulting Group
  • 30.
Six methods (continued)…
6. Case studies
 Purpose: To fully understand or depict participant experiences in a program, and to conduct a comprehensive examination through cross-comparison of cases
 Advantages: Fully depicts a participant's experience in program input, process, and results; a powerful means to portray the program to outsiders
 Challenges: Usually quite time-consuming to collect, organize, and describe; represents depth of information rather than breadth
Adapted from: Maberry Consulting Group
  • 31.
Basic analysis of quantitative (numeric) data
1. Make copies of your data and store the original. Use the copy for making edits, cutting and pasting, and so on
2. Tabulate the information in one or more spreadsheets
3. Examine the median, mean, and distribution (range) of data sets; think about which is/are more meaningful
4. Think about segmentation: Do you have sub-populations based on demographics or how much service was received?
5. Consider confounding variables: Is the effect due to a bias rather than your service or intervention?
Adapted from: Maberry Consulting Group
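Steps 3 and 4 above need nothing more than a spreadsheet, but they can also be sketched in a few lines of Python using only the standard library. The scores and group labels below are hypothetical sample data, not figures from any project:

```python
from statistics import mean, median

# Hypothetical post-test scores, tagged with a demographic field for step 4
scores = [
    {"group": "cohort_a", "score": 72},
    {"group": "cohort_a", "score": 85},
    {"group": "cohort_b", "score": 64},
    {"group": "cohort_b", "score": 90},
    {"group": "cohort_b", "score": 78},
]

# Step 3: median, mean, and distribution (range) for the whole data set
values = [row["score"] for row in scores]
print(f"mean={mean(values):.1f} median={median(values)} "
      f"range={min(values)}-{max(values)}")   # mean=77.8 median=78 range=64-90

# Step 4: the same summary for each sub-population
for group in sorted({row["group"] for row in scores}):
    subset = [row["score"] for row in scores if row["group"] == group]
    print(f"{group}: mean={mean(subset):.1f} median={median(subset)}")
```

Comparing the per-group summaries against the overall ones is a quick first check for the segmentation effects step 4 asks about.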
  • 32.
Basic analysis of qualitative (subjective) data
(interview or focus group responses, or written commentary on questionnaires)
1. Read through all the data
2. Organize comments into similar categories: concerns, suggestions, strengths, weaknesses, similar experiences, program inputs, recommendations, outputs, or outcome indicators
3. Look for patterns, associations, and possible causal relationships
4. Keep all raw data after completion, as recommended by the funder, in case it is needed for future reference
Adapted from: Maberry Consulting Group
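As a rough first pass at step 2, comments can be bucketed by keyword before any careful hand-coding. This is only an illustrative sketch: the categories, keyword lists, and sample comments below are invented for the example, and real coding would go well beyond keyword matching:

```python
from collections import defaultdict

# Illustrative keyword lists for two of the categories named in step 2
KEYWORDS = {
    "concerns":    ["worried", "concern", "problem", "unclear"],
    "suggestions": ["suggest", "should", "recommend"],
}

def categorize(comments):
    """Bucket free-text comments by keyword; unmatched ones go to 'uncoded'."""
    buckets = defaultdict(list)
    for comment in comments:
        text = comment.lower()
        hits = [cat for cat, words in KEYWORDS.items()
                if any(word in text for word in words)]
        for cat in (hits or ["uncoded"]):
            buckets[cat].append(comment)
    return dict(buckets)

comments = [
    "I suggest more evening sessions.",
    "I'm worried the workshops end too early.",
    "The facility was great.",
]
print(categorize(comments))
```

The point of the "uncoded" bucket is step 1: everything the keywords miss still gets read and coded by hand, and the keyword lists get refined as patterns emerge (step 3).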
  • 33.
Evaluation is not only Indicators
 Evaluation: indicators in the context of:
 Design & implementation
 Program relevance
 Efficiency
 Effectiveness
 Impact
 Sustainability
  • 34.
Experiences
 Write down the worst thing that an evaluator ever said to you about your project…
 …or the worst thing that ever went wrong on a grant-funded project at your organization!
  • 35.
WHAT?
 What do we do when something goes wrong?
 Partners don't get along
 Leadership issues
 Disagreements over data interpretation
 Non-performance by partner(s)
 Change in conditions
 Conflicts of interest
  • 36.
The Challenge of Being an Evaluator (Sirotnik, 1999)
If you…are looking for highly definitive, generalizable, cause/effect relationships between measurable, high-validity, independent and dependent variables, then I suggest you go to work for a pharmaceutical company… However, if you have a high tolerance for ambiguity and like to mess around…with contexts that are always colliding and changing…in these settings that are engaged in long-term processes of renewal and change – well then have I got a career for you!
  • 37.
    If the evaluatorain’t happy, ain’t noIf the evaluator ain’t happy, ain’t no one happy…one happy…  If your project isn’t going well, it is theIf your project isn’t going well, it is the evaluator’s job to tell you (and ultimately toevaluator’s job to tell you (and ultimately to tell the funder)tell the funder)  If you have an issue with your evaluator,If you have an issue with your evaluator, first look at and fix issues with your projectfirst look at and fix issues with your project  Most of the time, this will repair yourMost of the time, this will repair your relationship with the evaluator, or at leastrelationship with the evaluator, or at least make it tolerable!make it tolerable!
  • 38.
Conflict of Partners
 Partners have conflicting ideas
 One or more partners are not carrying through with activities or data collection
 Personality clashes
 Complaints about distribution of resources, or other complaints
  • 39.
Leadership Issues
 Poor communication – no contact, bad emails, late information and notices
 Bills not paid on time; delays in processing registrations for activities
 Bad record-keeping or data collection
 Failure to respond to formative evaluation
 Mismanagement of project resources
 Illegal activity
  • 40.
Different Interpretations of Data
 The evaluator sees the data one way; project partners see it another
 Lack of data leads to subjective assessment
 One party rejects the data
  • 41.
Non-Performance of Partners
 Failure to complete the scope of services
 Communication issues
 Significant, ongoing delays
 Personality clashes or personnel problems
 Undermining professional relationships
 Mismanagement of funds
  • 42.
Change in Conditions
 New competitor (threat)
 Change in law
 Sub-contractor or partner goes out of business or quits
 Significant economic downturn
 Change in leadership (project or organization)
  • 43.
Conflict of Interest
 Is the evaluator independent enough to objectively evaluate the project?
 Would giving a positive or negative evaluation result in financial gain?
 Future contracts with one or more partners
 Is there an appearance of a conflict of interest?
 Employer/employee or family relationship
  • 44.
Presentation Wrap-Up
 Additional stories and comments on the evaluation process
 Questions on evaluation
 Reference materials for further investigation and information on working with an evaluator and the evaluation process
  • 45.
Additional Reading
 Evaluating Professional Development, Thomas R. Guskey, ISBN 0-7619-7561-6, 2000.
 "8 Smooth Steps: Steps to Your Own Evaluation," Joellen Killion, Journal of the National Staff Development Council, p. 14, Fall 2003.
 Practicing Evaluation: A Collaborative Approach, Rita O'Sullivan, ISBN 0-7619-2456-5, 2004.
 W.K. Kellogg Foundation Evaluation Handbook & Development Guide, www.wkkf.org, 2008.
  • 46.
Thank you for attending!
 Contact information:
 Lisa Glickstein: lglickstein@aps1.net
 Marjorie Dennis: marjorie_dennis@uml.edu

Editor's Notes

  • #3 Have you worked with an evaluator? Have you worked with an evaluator on a federal project? State project? Other? Have you worked with more than one evaluator? Have you worked with an evaluator on more than one project? Door prize for most experienced participant
  • #4 evaluation is the key to learning; evaluation is key to business work in education; evaluation is not necessarily the solution; evaluation is the first step; evaluation is vital to continued success; evaluation is alive and well; evaluation is a subset of research; evaluation is a vital part of project planning and management; evaluation is essential to management; evaluation is provided for advisement purposes; evaluation is a valuable tool; evaluation is comparative; evaluation is used to measure; evaluation is needed; evaluation is proven; evaluation is conducted by a team of qualified; evaluation is core to instruction; evaluation is continuous; evaluation is a vital part of the region staff; evaluation is surely one of the oldest traditions; evaluation is fuller laziness; evaluation is underway; evaluation is used to manage a project to meet objectives; evaluation is to summarize; evaluation is an ongoing process; evaluation is the final phase of the site analysis process; evaluation is murky; evaluation is a single word; evaluation is the process of measuring the quality of performance against standards to determine if the standards have been met; evaluation is not necessarily the solution; evaluation is ambiguous; evaluation is vital to continued success; evaluation is a vital part of the project planning and management process; evaluation is important not only for accountability but for its value in helping us to learn from experience; evaluation is essential to the management of your communications program; evaluation is ethical and practical; evaluation is in the literature; evaluation is a vital part of the region staff college curriculum; evaluation is completed at the end of the meeting; evaluation is an important part of any manager's duties; evaluation is difficult; evaluation is the process of determining significance or worth; evaluation is strictly for use by the instructor; evaluation is a systematic inquiry designed to provide information to decision makers; evaluation is tracking the process of development and dissemination; evaluation is crucial; evaluation is a core mission of university of wisconsin; evaluation is the study of the relationship between brain functioning and observable behavior; evaluation is closely related to performance management; evaluation is accelerated; evaluation is only one of several types of data collection used by nsf grantees; evaluation is understood
  • #6 Correct, change and adapt along the way What happened and where do we go from here with these outcomes
  • #7 To put it simply, although research and evaluation are very similar and both use systematic methods, their intent and purposes differ. Remember that your intention for doing evaluation is to help improve programs and not to blame or criticize or seek to eliminate something.
  • #10 Sometimes a specific evaluator has been already chosen or specified by the funder. Are you writing the evaluator directly into the grant application and are they supplying you with the evaluation section to include in the application?
  • #11 Sample selection criteria: Comparative Evaluation Criteria: The following comparative evaluation criteria will be used in measuring the relative merits of the proposals that meet the minimum criteria established above. The Technical Proposal shall be reviewed and rated by the Superintendent or her designee(s), the Project Directors. The Technical Proposal shall be reviewed and rated as Highly Advantageous (the Proposal exceeds the specified evaluation standard), Advantageous (the Proposal fully meets the evaluation standard that has been specified), Not Advantageous (the Proposal meets the evaluation standard in a minimal way or lacks minor elements), or Unacceptable (the Proposal does not meet the standard). Personnel: The Evaluator who is able to demonstrate a high degree of staff education and relevant experience in American History and project evaluation in public education will be rated more favorably. An Evaluator whose staff demonstrates highly professional qualifications and strong experience in both education and American History will be rated as Highly Advantageous and those whose staff demonstrates highly professional qualifications and strong experience in education but not American History as Advantageous. An Evaluator whose staff has minimal qualifications and experience in both subject areas would be rated as Not Advantageous, and those with staff with no relevant experience assigned to this project would be rated as Unacceptable. Task Expertise: The Evaluator should be able to demonstrate considerable work experience specifically as it applies to the tasks outlined in the Scope of Services. An Evaluator who is able to list five (5) or more Teaching American History evaluations will be rated Highly Advantageous. The Evaluator who is able to list more than one Teaching American History evaluation and more than one evaluation in another federal Department of Education project area will be rated Advantageous. 
An Evaluator that has performed only one Teaching American History evaluation would be rated as Not Advantageous. Evaluators with no relevant project experience will be rated as Unacceptable. Proposal Content: The proposal narrative should be responsive to the tasks specified in the Scope of Services. The Evaluator producing a well-written proposal containing an effective plan to deliver these services will be rated Highly Advantageous; a well-written proposal containing a satisfactory plan will be rated as Advantageous; a proposal that is not well-written but addresses the full Scope of Services will be rated as Not Advantageous; a proposal that does not address the full Scope of Services will be rated Unacceptable.
  • #12 Which entity owns the data and the reports that emanate from the project?
  • #16 Communicate with program officer!!
  • #17 Sample Scope of Services: The Evaluator shall provide the following services to the Sample Public Schools: Participate on the project’s Advisory Board Assist project to develop rubrics for assessing project performance Gather data throughout the project from participant pre and post-intervention tests, in-year seminars, participant-created lesson plans, and the summer institutes Analyze data against relevant indicator benchmark rubrics; these findings will be used to formatively guide project implementation Using questions based upon the Massachusetts Comprehensive Assessment System (MCAS) History test for grades 5/7 and 10/11, develop a brief test to be taken by all participants at the beginning and end of their project year Interact with teacher participants both within the context of project activities (seminars, workshops, etc.) and also in classrooms while participants teach content using project-developed materials Assess lessons, curriculum materials, and website materials developed by participants Where possible and applicable, assess standardized student performance data. Collect participant pre-/post-intervention test data annually (at the beginning and end of participant engagement); collect other data such as observations, surveys, interviews, on an on-going basis throughout the project year as necessary Share formative data reports regularly with project staff Gather, analyze, and summatively report data in conjunction with the annual scoring of project performance against its benchmarked performance indicators No cost extension terms in the agreement.
  • #19 EHR/NSF Evaluation Handbook (2003) This is where we are going to have the first activity: mini evaluation of your project
  • #20 Questions: What did you learn? What do you still need to know/understand?
  • #22 EHR/NSF Evaluation Handbook (2003) Course correction mechanism
  • #23 EHR/NSF Evaluation Handbook (2003) Now that we are done, are there next steps and what are they?
  • #24 University of Wisconsin – Logic Models Outputs are not outcomes!
  • #25 Who needs the evaluation results? What do they need? Will the evaluation provide relevant, or otherwise useful, information in timely manner for the users? Often the best way to answer these questions is to ask your stakeholders and provide them with the opportunity to be part of the evaluation. Feasibility is another standard to keep in mind while planning and conducting evaluation. The purpose of this standard is to ensure that the evaluation conducted or to be conducted is realistic, prudent, diplomatic, and frugal. Applying this standard will help us to find focus and channel the resources to the parts or activities of the program that most need improvement. Propriety ensures that those of us involved in the evaluation, as evaluators or members of the evaluation team, behave legally, ethically, and with due regard for the welfare of those involved and affected by the program and its evaluation. Do you need to go through an Institutional Review Board (IRB)? Does the evaluation or methods employed in conducting evaluation protect the rights of individuals and the welfare of those involved? Does the evaluation engage those directly affected by the program as well as those affected by potential changes in the program (i.e., program participants, patients or the surrounding community)? Have we included people in the evaluation process and allowed them to have their own voices in making decisions that may affect their health and well being? Accuracy is the 4th Program Evaluation standard. In planning evaluation it’s important to identify ways or methods that would provide us with the most valid and reliable results
  • #33 Patterns examples: all participants in evening programs had similar concerns; most participants were drawn from the same geographic area or income demographic; effect of participant experience on learning outcomes; or processes and events that participants were most likely to remember
  • #35 Activity Prize opportunity!! The project started slow due to scheduling and technology issues within the school district. Since [district] is not wireless, monitoring of student activities was difficult to manage. Difficulty recruiting teachers from high-need districts. This problem is often compounded by the fact that school administrators are reluctant to allow teachers to miss class and attend trainings since history is not deemed a high-priority issue (since there is no longer a state test). There is some tension between [district] and its partner [university], mostly over logistical concerns. The evaluators note that the teacher content knowledge tests were taken on the same day, and would be more accurate if they were taken a semester apart to assess real knowledge acquisition. The U.S. Department of Education requires TAH projects to report the number of “completers” each year. Unfortunately, this definition and requirement was introduced after the Leadership in America project proposal was written and funded. Thus this project did not have completers by the current US Department of Education definition. The evaluators feel that the findings related to questionable gains on teacher content tests, recruitment, the integration of pedagogy and content, and difficulties encountered in tracking teacher participation are tied to the hurdles the project faced in its organization.
  • #39 Appeal to a higher authority – within project or organization, or professional mediation if necessary Document issues – incident log, financial statements Take turns and try to see the other party’s point of view Communicate face-to-face or by phone – be careful what you put in writing!
  • #40 Go to organization leaders if project leader is the issue Maintain incident log – document concerns Whistle-blower protection if you suspect illegal activity
  • #41 Collect more data – resolve other conflicts May be a leadership problem or partner conflict in disguise
  • #42 Always pay on a cost reimbursement basis! Fall back on scope of services and contract Document issues May require legal assistance
  • #43 Communicate with grant maker!! Make clear in evaluation report that these issues are beyond your control and not your fault! But what are you going to do about them? Have a plan of action.
  • #44 MUST be resolved!!! Appeal to the program officer if necessary
  • #47 We hoped you learned something today Please feel free to follow up with either of us.