Digging Deeper Into “Why” and “How”: Evaluation design ideas to further increase knowledge about programs that work for students


From the Penn IUR- and Penn GSE-sponsored conference:

“Preparing Today’s Students for Tomorrow’s Jobs in Metropolitan America: The Policy, Practice and Research Issues”

May 25-26, 2011

Organized by Laura Perna, a professor at Penn GSE, and Susan Wachter, a professor at Penn’s Wharton School, “Preparing Today’s Students for Tomorrow’s Jobs” explores the most effective institutional and public-policy strategies for ensuring that high school students, college students, and adult learners have the knowledge and skills required for future employment.

“The conference addresses such critical questions as: How do we define success with regard to the role of education in preparing students for work?” Perna said. “How well are different educational providers preparing future workers? What is the role of public policy in improving connections between education and work?

“It seeks to improve our understanding of several fundamental dimensions of this issue through insights from federal, state and local policy leaders, college administrators and researchers.”

Guest speakers include Eduardo Ochoa, assistant secretary of the U.S. Department of Education; former Pennsylvania Gov. Edward Rendell; Lori Shorr, chief education officer to Philadelphia Mayor Michael Nutter; Charles Kolb from the Committee for Economic Development in Washington, D.C.; Claudia Neuhauser from the University of Minnesota; Bethany Krom from the Mayo Clinic; and Harry Holzer from Georgetown University.

“Much recent attention focuses on the need to improve high school graduation and college degree completion. But relatively less attention has focused on whether graduates and degree recipients have the skills and education required by employers,” Perna said.

The event is sponsored by Penn’s Pre-Doctoral Training Program in Interdisciplinary Methods for Field-Based Research in Education, with funding from the U.S. Department of Education’s Institute of Education Sciences, in collaboration with Penn’s Institute for Urban Research.

  • Slide note: interpreting combinations of impact and implementation fidelity (see the sketch below).
    Impact, low fidelity: What could be strengthened? What was added or omitted that wasn’t planned?
    No impact, high fidelity: Was the theory wrong? Was the intervention too weak?
    No impact, low fidelity: Was the theory wrong? Or was the theory right but the implementation too poor?
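A minimal sketch of that diagnostic logic in Python (the function name is hypothetical, and the wording for the impact-with-high-fidelity case, which the note does not cover, is an assumption):

# Maps an impact/fidelity finding to the follow-up questions listed in the note above.
def diagnostic_questions(impact_found: bool, high_fidelity: bool) -> str:
    if impact_found and not high_fidelity:
        return "What could be strengthened? What was added or omitted that wasn't planned?"
    if not impact_found and high_fidelity:
        return "Was the theory wrong? Was the intervention too weak?"
    if not impact_found and not high_fidelity:
        return "Was the theory wrong? Or was it right but the implementation too poor?"
    # Impact with high fidelity is not covered in the note; this reading is an assumption.
    return "Program worked as designed: which components drove the effect?"

print(diagnostic_questions(impact_found=False, high_fidelity=True))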


  • 1. Digging Deeper Into “Why” and “How”:
    Evaluation design ideas to further increase knowledge about programs that work for students
    Lashawn Richburg-Hayes
    Preparing Today’s Students for Tomorrow’s Jobs in Metropolitan America
    University of Pennsylvania, May 25, 2011
  • 2. “Guidance” is a catch-all…
    Academic advising and counseling
    (e.g. orientation, info. on navigating the college, assistance in course selection…)
    Academic supports and “coaching”
    (e.g. tutoring, remedial assistance…)
    Personal guidance and counseling
    (e.g. crisis intervention, info & referral…)
    Career counseling
    (e.g. aptitude assessments, career planning…)
    Supplemental services
    (childcare subsidies, transportation, books…)
  • 3. More intensive / “heavier touch”
    Less intensive / “lighter touch”
  • 4. Striking similarity of results
  • 5. Questions…
    What is the theory of change?
    How well were interventions implemented?
    What components matter most? In-person contact or structured connection? How much dosage is enough?
    Why are there differences by gender?
    Cost effectiveness
  • 6. What does a lack of impact imply about whether the program works?
  • 7. Ultimate Goal: Avoid “Black Box” Evaluation
    “Simple…black box evaluations may provide a gross assessment of whether or not a program works but fail to identify the underlying causal mechanisms that generate the treatment effects, thus failing to pinpoint the deficiencies of the program for future program improvement or development.” (Chen, 1990)
  • 8. Helpful solutions…
    Driver diagrams
    Intervention strength and treatment contrast
    Logic models
    Implementation fidelity
  • 9. Driver diagram
  • 10. Strength & Treatment contrast (see the numeric sketch below)
    Random assignment divides students between intervention services and control services.
    Intervention services flow from planned/expected services (strength of program defined) to offered services (fidelity to design) to received services (participation rate / dosage).
    Treatment contrast compares received intervention services with the control services students receive; the strength of both program and control services is measured.
    Context: implementation narrative, student qualitative narrative, coverage / bias.
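To make treatment contrast and implementation fidelity concrete, here is a minimal sketch with hypothetical numbers (the service names, values, and outcomes are illustrative, not from the presentation). It compares what the program plans to deliver, what program-group students actually receive, and what control-group students receive on their own, then estimates the impact as a simple difference in mean outcomes under random assignment.

# Hypothetical service receipt per student per term (illustrative only).
planned = {"advising_sessions": 4.0, "tutoring_hours": 10.0}           # strength of program as designed
received_program = {"advising_sessions": 2.5, "tutoring_hours": 6.0}   # what program-group students actually got
received_control = {"advising_sessions": 1.5, "tutoring_hours": 4.5}   # control students can find similar help elsewhere

for service in planned:
    fidelity = received_program[service] / planned[service]             # share of the design that was delivered
    contrast = received_program[service] - received_control[service]    # what the program adds over "business as usual"
    print(f"{service}: fidelity {fidelity:.0%}, treatment contrast {contrast:.1f}")

# Under random assignment, the impact estimate is the difference in mean outcomes.
credits_program = [12.0, 15.0, 9.0, 14.0]   # hypothetical credits earned, program group
credits_control = [11.0, 13.0, 10.0, 12.0]  # hypothetical credits earned, control group
impact = sum(credits_program) / len(credits_program) - sum(credits_control) / len(credits_control)
print(f"Estimated impact on credits earned: {impact:.2f}")

A small treatment contrast can help explain a small impact even when fidelity is high, because control-group students often obtain similar services on their own.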
  • 11. Pima’s Logic Model
  • 12. Contact information
    Lashawn Richburg-Hayes, Ph.D.,
    Deputy Director
    Young Adult and Postsecondary Education
    For more information about MDRC’s work and to download our publications, go to: