Research Policy & Evaluation: Checking Assumptions to Accomplish Collaborative Evaluation
 

  • What organization do you represent? What do you hope to get out of this session? (Why did you choose to attend?) Presenters will use this information to tailor their comments to the audience’s interest.
  • MGP Program Demographics (10 mins)
    • What year did this summer program begin serving students?
    • Number of youth served annually by the summer program
    • At how many sites does the summer program operate?
    • This program serves children entering grades:
    • Student/staff ratio
    • Geographic location
    • Program location
    • How many hours of programming does each student have the opportunity to receive through this summer program?
    • Special populations served, if any (e.g., low-income, special education, limited English)
    • What was the summer program’s total program budget for 2008?
    • The program’s three primary sources of funding for summer 2008 and how much funding (dollar amount) was received from each source (e.g., foundations, fee-for-service, Title I, 21st CCLC)
  • PUEO Program Demographics (10 mins)
  • Deconstruct Key Evaluation Framing Documents (30 mins)
    • Presenters will distribute the PUEO program logic model and the MGP evaluation questions to attendees and ask them to form small groups.
    • In small groups, participants will deconstruct these documents, addressing questions such as:
        • Why would a program use this document? For what purpose?
        • What would evaluators need to know to write this document?
        • Which program staff would be involved? When? For how long?
    • Distribute “Areas You Should Consider” for your evaluation. (Brenda will write.)
    • Presenters will lead a discussion based on the questions posed to the small groups.
  • Partnering with Evaluators/Open Discussion (25 mins)
    • Two types of evaluators:
        • Assess & validate: for programs that already have a model and feel good about where it is in development, an evaluator assesses and validates the program, usually through a random-assignment or quasi-experimental design.
        • Collaborative stakeholder approach: focused on program development and on whether the model components are the right ones; programs seek a collaborative relationship with an evaluator, and a process model allows changes along the way.
    • Become an informed consumer. (Brenda)
    • The program directors and their evaluator partners will speak about their experience partnering together, the tools they used during the evaluation (surveys, focus groups, activity observation, quantitative analysis), and how evaluator feedback was used to inform program growth. Sufficient time will be available for participant questions regarding the information covered.

Presentation Transcript

  • Checking Assumptions to Accomplish Collaborative Evaluation. National Partnership for Educational Access, 2nd Annual Conference, April 8, 2010
  • A little about us…
    • Carl Ackerman, Clarence T. C. Ching PUEO (Partnerships in Unlimited Education) Program
    • Brenda McLaughlin, National Summer Learning Association
    • Beth Casey, Middle Grades Partnership
  • A little about you…
    • What organization do you represent?
    • What do you hope to learn?
  • What do kids stand to lose?
    • Academic knowledge
    • Healthy habits
    • Access to meals
    • Technology know-how
  • What do kids stand to lose? Academic Knowledge
    • Since 1906, numerous studies have confirmed that children experience learning losses in math and reading without continued opportunities for regular practice (White, Heyns, Cooper, Alexander)
    • Disadvantaged youth are disproportionately affected by losses in literacy skills
  • What do kids stand to lose? Academic Knowledge
    • Two-thirds of the ninth-grade reading achievement gap can be explained by unequal access to summer learning opportunities, contributing to fewer disadvantaged youth graduating from high school or entering college (Alexander, Entwisle & Olson, 2007)
  • [Figure: cumulative reading gains during the school year versus the summer for disadvantaged and better-off students, by year] Source: Doris Entwisle, Karl Alexander, and Linda Olson, Children, Schools, and Inequality, 1997, Table 3.1
  • What do kids stand to lose? Healthy Habits
    • Children gain BMI nearly twice as fast during the summer as during the school year (von Hippel, Powell, Downey & Rowland, 2007)
    • Black and Hispanic children, and children who are already overweight, experience healthier BMI gain during the school year
    • School-based fitness interventions can promote better health, but without sustained intervention these benefits are lost over the summer break (Carrel et al., 2007)
  • What do kids stand to lose? Access to Meals
    • In July 2008, 17.3 children received Summer Nutrition for every 100 low-income students who received lunch in the 2007-2008 school year (FRAC)
    • Where you live makes a difference!
        • Participation ranges from a low of 4.4% in Mississippi to a high of 88.8% in Washington, DC
        • Only 10 states serve at least 25% of their low-income kids
        • 11 states serve less than 10%
  • What do kids stand to lose? Technology Know-How
    • Library and technology usage differs by income (Neumann & Celano, 2008)
    • Print materials in the library:
        • Lower-income children choose less challenging material, with less print and lower reading levels
        • Read 1 line of print for every 3 read by middle-income children
        • Spent less time with each book: 6.6 versus 12 minutes
  • Evaluation Considerations
    • How is our population the same as or different from the research? (demographics)
    • How is our program being implemented? (process)
    • Is our program being implemented as we intend? (fidelity of implementation)
    • What parts of our program are influencing our results? (quality)
    • Are we making a difference? (impact)
  • MGP Background
    • Pilot summer – 2005
    • 500 students annually
    • 13 sites involving 20 schools and three universities
    • This program serves children entering grades 7th, 8th, 9th, and 10th
  • MGP Background
    • Student/staff ratio: 1:6
    • Urban
    • Baltimore, Maryland
    • 162 hours per summer
  • MGP Background
    • 2008 budget: $1.6 million
    • Funded 100% by foundations and private individuals in 2008
  • PUEO Background
    • Started in 2005
    • Served 160 participants in 2009. Anticipate serving 200 in 2010.
    • Operates a single site. Older participants may meet at a different site.
    • Grades 6th through 12th
  • PUEO Background
    • Student/staff ratio varies by age group, from 1:5 for the youngest children to 1:13 for the oldest
    • Participants from both urban and rural backgrounds
    • Located at Punahou School in Honolulu, Hawai’i
    • 100+ hours of programming, depending on level
  • PUEO Background
    • Participants are all from low-income families; some are from immigrant households
    • Total budget: $411,906
    • Supported by in-kind contributions from Punahou School and by foundations
  • Key Evaluation Documents: Logic Model & Evaluation Questions
    • Why would a program use this document? For what purpose?
    • What would evaluators need to know to write this document?
    • Which program staff would be involved? When? How long?
  • Partnering with Evaluators: Two Approaches
    • Assess & Validate
        • Usually random assignment or quasi-experimental
    • Collaborative Stakeholder Approach
        • Program development focused
        • Process model
    • Not mutually exclusive
    • Be an informed consumer!
  • Contact Us
    • Carl Ackerman [email_address]
    • Beth Casey [email_address]
    • Brenda McLaughlin [email_address]