Crafting Your Project’s Evaluation Plan
Presentation to Hands-on Proposal Development Workshop
June 18, 2009

  1. Crafting Your Project’s Evaluation Plan (Presentation to Hands-on Proposal Development Workshop, June 18, 2009)
  2. Agenda • Goals and Objectives • Evaluation Definition and Types • Writing Your Evaluation Plan • Data Analysis and Reporting • What Reviewers Look For • Publicizing Your Results
  3. Goals and Objectives
  4. Goals/Objectives • The most important element of a successful program is the development of attainable goals and measurable objectives – Guides program planning and design – Communicates to stakeholders – Enables evaluation • Success is dependent upon realistic goals
  5. Goals: Characteristics • Describe the overall purpose of the program • Describe broad outcomes and concepts (what we want to accomplish) • Are expressed in general terms
  6. Goals: Development Steps • Research the topic (define needs) • Involve stakeholders (gains commitment) • Brainstorm goals • Select the goals that have priority (decide on what matters) • Limit the program to two to five goals (select realistic goals)
  7. Goals: Samples • The NLM databases will become an integral component of the institution’s public health department instruction • The NLM databases will become a valuable public health resource for senior citizens in the community • The project will identify the methods most effective in increasing the utilization of the NLM databases
  8. Objectives • Specifically state how the goals will be achieved • Are measurable: Define what you want to see • Encourage a consistent focus on program functions
  9. Objectives Are Not… Tasks • Conducting a training session is a task. – Poor objective: We will conduct a training session • An effective objective defines intent – Better objective: Faculty who attend the training session will create one or more activities to instruct students on the NLM database
  10. How to be SMART
  11. SMART Objectives • Specific: Be precise about what you are going to achieve • Measurable: Quantify the objectives • Appropriate: Align with the needs of the target audience • Realistic: Do you have the resources to make the objective happen? • Time-Specific: State when you will achieve the objective
  12. SMART: Specific Objectives. Specific: Be precise about what you are going to achieve – Specify the target – Specify the intended output – One output per objective – Avoid vague verbs (e.g. know, understand) – Make sure the objective is linked to the goal – Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
  13. SMART: Measurable Objectives. Measurable: Quantify the objectives – Use measures as indicators of program success – If possible, establish a baseline (e.g. in January 2009, 5% of the public health majors utilized NLM resources in their final project) – Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
  14. SMART: Appropriate Objectives. Appropriate: Align with the needs of the target audience – Meeting the objective will advance the goal – Identify a specific target audience – Be inclusive of diversity within your group – Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project – Note: The "A" is sometimes called "Attainable" or "Achievable" in the literature.
  15. SMART: Realistic Objectives. Realistic: Do you have the resources to make the objective happen? – Are important to stakeholders – Are adequately resourced – Can be achieved – Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project. Take care with what you say you can do! Is it realistic for all (100%) students to utilize NLM resources in their final project?
  16. SMART: Time-Specific Objectives. Time-Specific: State when you will achieve the objective – Provide a timeframe indicating when the objective will be met – Sample: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project
  17. Goals and Objectives. (Diagram: one Goal linked to Objective One, Objective Two, and Objective Three.) Maintain a clear connection between your goals and objectives. By maintaining this connection, you are articulating your theory of goal attainment.
  18. Goals and Objectives. Final Note: – The Goals and Objectives communicate your intended results – Know your story. When stakeholders want to know what your program will do, connect all of your activities to your goals and objectives – Have an elevator speech
  19. SMART Tool
  20. SMART Tool. Goal: The National Library of Medicine’s databases will become an integral component of the institution’s public health department instruction. Objective 1: By January 2010, all students in the Public Health course will utilize one or more NLM resources in their final project. Breakdown: verb = utilize; metric = number; population = Public Health students; object = NLM resource; baseline measure = --; goal measure = all; timeframe = January 2010. Objective 2: By January 2010, at least 10 public health courses from a baseline of zero courses will apply an NLM database in course instruction. Breakdown: verb = apply; metric = number; population = Public Health courses; object = NLM databases; baseline measure = 0; goal measure = 10; timeframe = January 2010.
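The breakdown columns above can be captured as a simple data structure. Below is a minimal Python sketch; the record shown follows the second sample objective on the slide, but the class itself and its field names are illustrative assumptions, not part of the workshop materials:

      # Minimal sketch: the SMART Tool breakdown as a data structure.
      from dataclasses import dataclass

      @dataclass
      class ObjectiveBreakdown:
          verb: str            # what the population will do (e.g. "apply")
          metric: str          # how achievement is counted
          population: str      # who is being measured
          target_object: str   # the resource involved
          baseline_measure: str
          goal_measure: str
          timeframe: str

      # Second sample objective from the slide, expressed as a breakdown record.
      objective_two = ObjectiveBreakdown(
          verb="apply",
          metric="number",
          population="Public Health courses",
          target_object="NLM databases",
          baseline_measure="0",
          goal_measure="10",
          timeframe="January 2010",
      )
      print(objective_two)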
  21. SMART Benefits and Costs
  22. Benefits • Facilitates communication with program stakeholders • Informs on what data should be collected • Enables effective program management • Facilitates the linkage of activities and intended effects/goals • Enables a focus on evaluation – Process level (activities) – Output level – Outcome level • Facilitates replication
  23. Costs and Limitations • Impression that creativity is limited • Time-consuming • GIGO (garbage in, garbage out) • Encourages too great a focus on discrete measures
  24. Comment on Metrics • A well-written objective suggests the metric(s) • Example: – By January 2010, all students in the Public Health Class will complete a project that uses one or more NLM resources • Metrics: – Total number of students – Total number that use an NLM resource in their project • While this may appear obvious, this is an area where programs often fail.
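The two metrics on this slide combine directly into a progress figure against the "all students" target. A minimal Python sketch, using hypothetical counts:

      # Hypothetical counts; real values would come from process monitoring.
      total_students = 40        # total number of students in the Public Health class
      students_using_nlm = 32    # total number whose final project used an NLM resource

      share = students_using_nlm / total_students
      print(f"{students_using_nlm} of {total_students} students ({share:.0%}) used an NLM resource")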
  25. Evaluation: Definitions and Types
  26. Evaluation. What is it? Why do I HAVE to do it?? We will: – Define Evaluation – Introduce Evaluation Types – Review Why Evaluation is Important
  27. Evaluation. What is it? Evaluation… – Assesses program achievements or progress – Enables a data-based judgment on program quality – Enables the program to build on strengths and minimize challenges
  28. Evaluation. What kinds are there? – Monitoring: Keeping track of what the program is doing – Formative: Progress towards meeting goals and objectives – Summative: Documents what goals have been met
  29. Evaluation. Why do I have to evaluate? – Provides systematic information to continually improve your program – When people ask if you have met your goals, you need to be able to say, “yes,” “no,” or “maybe.” Your answer should not be “I don’t know.”
  30. Evaluation Types: 1. Process Monitoring 2. Formative Evaluation 3. Summative Evaluation • Outputs • Outcomes
  31. Process Monitoring. What are we doing? – This evaluation type is a continuous activity • Is the data being captured in an organized fashion? • Is the program collecting the type of information that is needed for final reports and evaluations? Tip: The answers should be at your fingertips!
  32. Formative Evaluation. Are we making progress towards meeting our goals? Are mid-course corrections needed? – Occurs during program operation • Is the program maintaining a focus on its planned activities? • Does the monitoring data give evidence that goals and objectives will be met? • What steps need to be taken to continue progress, or are adjustments needed? • If adjustments are needed, have all stakeholders been informed?
  33. Summative Evaluation. What impact did we have? What were our outputs and outcomes? – Occurs at the conclusion of the program • What were our program outputs? • What were our program outcomes? • Did we achieve our goals and objectives? • What improvements can be made to make the program stronger?
  34. Summative Evaluation. What are outputs and outcomes? – Outputs are the program results that can be quantified (e.g. the number of training sessions, the number of participants in the training sessions) – Outcomes are the goals of the program that typically need a more in-depth evaluation (e.g. How do senior citizens utilize the NLM databases?) Reminder: your program outputs and outcomes are specifically related to your program goals and objectives
  35. Writing Your Evaluation Plan
  36. Evaluation Plan • Think about and plan for how you will do evaluation now. • If you don’t think about evaluation now, you may miss important opportunities to collect data that could improve your program.
  37. Evaluation Plan Components. Your evaluation plan should have these components: – Evaluation Questions – Methodology • Process Monitoring: Outputs • Formative Evaluation: Progress towards meeting outputs and outcomes • Summative Evaluation: Outcomes
  38. Evaluation Questions. Similar to how goals and objectives guide program development, Evaluation Questions guide program assessment and evaluation. We will review: – Definition – Development
  39. Evaluation Questions. What are they? – A good evaluation question specifically outlines what is being assessed and suggests the data needed – Evaluation question types reflect the program stage (i.e. formative, summative) • Formative: What are the best media channels to reach the target audience about NLM resources? • Summative: How effective are select media channels in reaching our target audience?
  40. Evaluation Questions. How do I develop evaluation questions? – The first step is to refer back to the SMART objectives. If the objectives reflect what the program was trying to do, the evaluation should assess this – The second step is to form the questions • Objective: By January 2010, ten courses will utilize NLM resources in their classes – Evaluation Question One: How many courses are utilizing NLM resources? – Evaluation Question Two: What do faculty identify as factors in their decisions to utilize NLM resources? – Evaluation Question Three: How do faculty utilize NLM resources? – The third step is to re-evaluate the questions. Are these the kind of questions needed to inform on program success?
  41. Evaluation Plan. I have evaluation questions… what do I do now? – Develop a methodology – What data and data sources will you need? • Develop an evaluation crosswalk • How will you analyze the data? • Can you do it or do you need help? – How will you report the results?
  42. Evaluation Plan. Methodology • Outline your steps by Evaluation Stage – Process Monitoring: In the first month of the grant, an Excel spreadsheet will be created to track all project activities – Summative Evaluation: In the first month of the grant, all surveys will be created and tested.
  43. Evaluation Plan • An evaluation crosswalk defines what data sources will be used to inform on the evaluation question. (Crosswalk table: the rows list the three evaluation questions: How many courses are utilizing NLM resources? What do faculty identify as factors in their decisions to utilize NLM resources? How do faculty utilize NLM resources? The columns list the data sources: course syllabi, faculty interviews, faculty survey, and student surveys. Checkmarks indicate which sources inform each question.)
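In code, a crosswalk is just a mapping from each evaluation question to the data sources that will inform it. A minimal Python sketch follows; the question wording comes from the slides, but the specific question-to-source assignments shown here are illustrative assumptions:

      # Illustrative crosswalk: evaluation question -> data sources (assignments assumed).
      crosswalk = {
          "How many courses are utilizing NLM resources?":
              ["Course syllabi", "Faculty survey"],
          "What do faculty identify as factors in their decisions to utilize NLM resources?":
              ["Faculty interviews", "Faculty survey"],
          "How do faculty utilize NLM resources?":
              ["Faculty interviews", "Faculty survey", "Student surveys"],
      }

      for question, sources in crosswalk.items():
          print(question)
          print("  data sources:", ", ".join(sources))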
  44. Data Analysis and Reporting
  45. Data Analysis. I have data… what do I do now? We will review – Data types – Analysis
  46. Data Analysis. Data Types: Quantitative Data – Measurable and tangible – Involves the counting of people, behaviors, conditions, or other events – Enables the use of statistics to answer questions
  47. Data Analysis. Steps – Understand your data! • Be able to explain what the numbers mean – Organize the data • Enter the data in a program like Excel or SPSS. (Learn how to use the pivot table function in Excel!) – “Clean” the data • Look at the data. Are there data entry mistakes? Does something look odd? Check and fix mistakes! – Compile the data • Summarize the data in tables or graphs
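The organize/clean/compile steps above translate directly into a small script. Here is a minimal Python sketch using pandas in place of the Excel or SPSS workflow described on the slide; the file name and column names are assumptions about a hypothetical training-attendance log:

      import pandas as pd

      # Organize: load the raw data (assumed columns: session_date, participant_type, attended).
      df = pd.read_csv("training_attendance.csv")

      # Clean: fix obvious entry problems before summarizing.
      df["participant_type"] = df["participant_type"].str.strip().str.lower()
      df = df.dropna(subset=["session_date", "participant_type"])

      # Compile: a pivot table summarizing attendance by session and participant type
      # (assumes "attended" is recorded as 1/0).
      summary = df.pivot_table(
          index="session_date",
          columns="participant_type",
          values="attended",
          aggfunc="sum",
          fill_value=0,
      )
      print(summary)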
  48. Data Analysis. Tips – Make it simple! It is not effective when no one understands your results – Increase the white space! Graphs and tables are effective communicators – When the number of people is less than 30, report counts rather than percentages.
  49. Data Analysis. Data Types: Qualitative Data – Data is rich in detail and description – Text or narrative format – Examples: interviews, case studies, focus groups, or document review.
  50. Data Analysis. Steps – Organize the data • Enter data into a program like Excel – “Clean” the data • Read the data. Correct data entry errors. (Caution: Do not change the wording of what was recorded) – Label/Code the data • Give each group of data a label/code. This is an iterative process to finalize the labels/codes – Compile the data • Group like labels/codes together to see what the data is telling you.
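Once responses have been labeled, the compile step is simply grouping and counting like codes. A minimal Python sketch; the respondents, codes, and counts below are hypothetical:

      from collections import Counter

      # Interview responses that have already been coded (hypothetical labels).
      coded_responses = [
          ("Faculty member 1", "ease_of_access"),
          ("Faculty member 2", "ease_of_access"),
          ("Faculty member 3", "relevance_to_coursework"),
          ("Faculty member 4", "training_needed"),
          ("Faculty member 5", "ease_of_access"),
      ]

      # Compile: group like labels/codes together and count them.
      code_counts = Counter(code for _, code in coded_responses)
      for code, count in code_counts.most_common():
          print(f"{code}: {count} responses")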
  51. Data Analysis. Tips – Maintain objectivity. Recognize and limit your biases – Where possible, quantify the qualitative data (e.g. 25 people said something was important) – Visuals can help readers understand qualitative data
  52. Data Analysis. Overall Tips – Remember: The data is used to answer the evaluation questions – Quantitative data and qualitative data can be used to support one another – When possible, have others look at your analysis
  53. Reporting. Tell your story! – Report Sections – Tips – Understand Required Reporting
  54. Reporting. Report Sections – Program Background: What are the program goals and objectives? – Program Activities: What did you do? Be specific. Include dates, number of activities, activity types, etc. – Methodology: State how you are evaluating program effectiveness.
  55. Reporting. Report Sections – Results: List the results from your data collection instruments – Analysis/Conclusions: What do the results mean? Was the program successful? Were there things that could have been improved? – Next Steps: How will you use these results to keep the program growing?
  56. Reporting. Tips – You are doing important things. Do not brag, but do not minimize your accomplishments – Be honest… if the data indicates something did not go well, state this – All conclusions on strengths and weaknesses should be linked to the data – Know your audience: customize your report for the people you are talking to – Do not extrapolate beyond the data
  57. Reporting. Required Reporting – Funders often have reporting requirements – Request that the funder define the data that will be required – Incorporate this data collection into your process monitoring
  58. Sample Evaluation Plan
  59. Sample Evaluation Plan. The project has three evaluation questions: 1. How many courses are utilizing NLM resources? 2. What do faculty identify as factors in their decisions to utilize NLM resources? 3. How do faculty utilize NLM resources? The crosswalk below displays the data sources for each evaluation question. (Crosswalk table: the data sources are course syllabi, faculty interviews, a faculty survey, and student surveys; checkmarks indicate which sources inform each question.) The evaluation components are Process Monitoring, Formative Evaluation, and Summative Evaluation.
  60. Sample Evaluation Plan. Process Monitoring: An Excel workbook will be created to track the following activities: • Courses utilizing NLM resources by semester • Training sessions conducted • Number of participants per training session by type (faculty, students) • How activities were evaluated. The Principal Investigator will be responsible for entering and maintaining the data. Tracking this information will provide the program manager with a comprehensive listing of program activities and participants. The Excel workbook will be created in the first month of the program. Formative Evaluation: To evaluate program status, the Principal Investigator will create a Project Activity Checklist that lists all activities. On a quarterly basis, each project team member will independently rate two areas for each activity. The first area is whether the activity contributes to obtaining the project goals. The second area is whether each activity is complete, on schedule, behind schedule, or not started. The Principal Investigator will organize all the responses of the team members. The team will review the results to determine if any interventions are needed to maintain progress toward the goals. A status report will be submitted to the program manager each quarter prior to the close of the grant.
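The tracking workbook described in the process-monitoring component could be set up in a few lines. A minimal Python sketch using pandas (writing .xlsx requires the openpyxl package); the file name, sheet name, and column names are assumptions based on the activities listed above:

      import pandas as pd

      # One column per tracked item from the sample plan (names assumed).
      columns = [
          "semester",
          "courses_utilizing_nlm_resources",
          "training_sessions_conducted",
          "participants_faculty",
          "participants_students",
          "how_activities_were_evaluated",
      ]

      # Create an empty tracking sheet the project team can fill in each semester.
      tracking = pd.DataFrame(columns=columns)
      tracking.to_excel("process_monitoring.xlsx", sheet_name="activities", index=False)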
  61. Sample Evaluation Plan. Summative Evaluation: The summative evaluation will respond to each of the evaluation questions. The data sources and how they will be utilized are: • Course syllabi • Faculty interviews • Faculty survey • Student survey. A course syllabi checklist will be created to document whether the course includes NLM content. Syllabi will be checked at the beginning of the grant program and the end of the grant program to determine whether there has been an increase in NLM resource utilization. The faculty interview protocol, faculty survey, and student survey will be created in the first month of the grant program and submitted to the institution’s Institutional Review Board for approval. Data collection and analysis will occur in the final month of the grant period. Data analysis will inform on whether the project goals have been met. The final evaluation report will have these sections: Program Background, Program Activities, Project Description, Methodology, Results, Analysis/Conclusion, Next Steps.
  62. What Reviewers Look For
  63. Proposal Evaluation Plans. People have had many workshops on how to write evaluation plans for proposals. – What do the reviewers see? – What should the reviewers see?
  64. What Reviewers See… • “We will conduct surveys at the end of every training session” • “We will conduct pre/post tests to…” • “We will have a focus group…”
  65. What Reviewers See… Sounds good!!! What’s the problem??? – Evaluation plan has no clear connection to the goals and objectives specified in the proposal. • Writers often lay out plans to evaluate activities that are part of the program but not the overall program
  66. What Reviewers Should See • Evaluation Questions based on the project goals and objectives • Defined metrics • Defined data sources. A quality evaluation plan in a proposal describes how the overall success of the program will be determined
  67. Sustainability – What do the reviewers see? – What should the reviewers see?
  68. What Reviewers See… • “We will work to identify more funding” What Reviewers Should See… • A more in-depth understanding of sustainability – What will be the lasting impacts of the program if it is successful?
  69. Publicizing Your Results
  70. Publishing Your Results • Identify a publication that would have an interest in your study • Have a full understanding of the publication’s submission guidelines (style, deadlines, etc.) • PROOFREAD! You may have an outstanding study, but your submission loses credibility if there are grammar or spelling errors • Do not be discouraged by a “no”! Keep trying and listen to the reviewer comments. RESUBMIT!!
  71. Sample Publication Guidelines (http://www.ehealthinternational.org/guidelines.htm) • AUTHOR'S GUIDELINES: Manuscripts should be submitted electronically to the Managing Editor, Hasan Sapci, M.D., at the following email address: mdsapci@umich.edu. The submission should include a cover letter to provide a very brief description of the topic of the paper together with an explanation that the manuscript is original and not submitted elsewhere. • Manuscript Preparation: Each manuscript should have a title page including a short running title as well as a listing of all authors. The list of authors should include names, degrees and institutional affiliation, as well as a complete postal mailing address, fax, telephone, and e-mail address for each author. In addition, a corresponding author should sign the cover letter. Each manuscript should have an abstract of not more than 250 words, without any citations or references. The abstract must include a statement of the problem addressed in the paper, the methodology used in the analysis, the main findings, and conclusions, as appropriate. Manuscripts should be prepared in Microsoft Word or WordPerfect. Tables and Figures should be submitted separately, preferably in uncompressed TIFF, BMP or PNG format. We recommend reading these articles about formatting and style: Uniform Requirements for Manuscripts Submitted to Biomedical Journals: Writing and Editing for Biomedical Publication (http://www.icmje.org/index.html); American Medical Association Manual of Style: A Guide for Authors and Editors, 9th ed. Baltimore, Md.: Williams & Wilkins, 1998.
  72. Sample Publication Guidelines (http://www.ehealthinternational.org/guidelines.htm) • References: References are numbered consecutively in the text as superscripts, beginning with number 1. When the reference is at the end of a sentence, punctuation should precede the superscript. When in the middle of a sentence, superscripts are included in the text without punctuation. The list of references at the end of the manuscript should be numbered consecutively according to the order in which they appear in the text. Journal names must be abbreviated according to the style of Index Medicus. • Copyright: Authors are responsible for obtaining permission to use published material, including their own work. Permission must be provided in writing from the original copyright holder (typically the publisher, not the editor, except for unpublished material where the original author is the copyright holder).
  73. Questions? Barry Nagle, Director, Center for Assessment, Planning, and Accountability, United Negro College Fund Special Programs Corporation, 2750 Prosperity Avenue, Suite 600, Fairfax, VA 22031. barry.nagle@uncfsp.org, 703-205-8139
