Digging Deeper Into “Why” and “How”: Evaluation design ideas to further increase knowledge about programs that work for students


From the conference sponsored by Penn IUR and Penn GSE:

“Preparing Today’s Students for Tomorrow’s Jobs in Metropolitan America: The Policy, Practice and Research Issues”

May 25-26, 2011

Organized by Laura Perna, a professor in Penn GSE, and Susan Wachter, a professor in Penn’s Wharton School, “Preparing Today’s Students for Tomorrow’s Jobs” explores the most effective institutional and public-policy strategies for ensuring that high school and college students and adult learners have the knowledge and skills required for future employment.

“The conference addresses such critical questions as: How do we define success with regard to the role of education in preparing students for work?” Perna said. “How well are different educational providers preparing future workers? What is the role of public policy in improving connections between education and work?

“It seeks to improve our understanding of several fundamental dimensions of this issue through insights from federal, state and local policy leaders, college administrators and researchers.”

Guest speakers include Eduardo Ochoa, assistant secretary of the U.S. Department of Education; former Pennsylvania Gov. Edward Rendell; Lori Shorr, chief education officer to Philadelphia Mayor Michael Nutter; Charles Kolb from the Committee for Economic Development in Washington, D.C.; Claudia Neuhauser from the University of Minnesota; Bethany Krom from the Mayo Clinic; and Harry Holzer from Georgetown University.

“Much recent attention focuses on the need to improve high school graduation and college degree completion. But relatively less attention has focused on whether graduates and degree recipients have the skills and education required by employers,” Perna said.

The event is sponsored by Penn’s Pre-Doctoral Training Program in Interdisciplinary Methods for Field-Based Research in Education, with funding from the U.S. Department of Education’s Institute of Education Sciences, in collaboration with Penn’s Institute for Urban Research.

Speaker notes:
  • Impact, low fidelity: What could be strengthened? What was added or omitted that wasn’t planned?
  • No impact, high fidelity: Was the theory wrong? Was the intervention too weak?
  • No impact, low fidelity: Was the theory wrong? Was the theory right, but implementation too poor?
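Read as a decision rule, these notes pair each impact/fidelity outcome with the diagnostic questions to ask next. As a minimal sketch (not part of the original slides), the Python snippet below encodes that pairing; the fourth cell, impact with high fidelity, does not appear in the notes and is filled in here as the implied “worked as designed” case.

```python
# Hypothetical encoding of the impact-by-fidelity grid from the speaker
# notes. The (True, True) cell is not in the notes; it is assumed here
# as the implied "program worked as designed" outcome.
INTERPRETATION = {
    # (had_impact, high_fidelity): diagnostic questions to ask next
    (True, True): ["Program was delivered as designed and produced effects."],
    (True, False): ["What could be strengthened?",
                    "What was added or omitted that wasn't planned?"],
    (False, True): ["Was the theory wrong?",
                    "Was the intervention too weak?"],
    (False, False): ["Was the theory wrong?",
                     "Was the theory right, but implementation too poor?"],
}

def interpret(had_impact: bool, high_fidelity: bool) -> list[str]:
    """Return the diagnostic questions for an evaluation outcome."""
    return INTERPRETATION[(had_impact, high_fidelity)]

# Example: a null finding from a faithfully implemented program
print(interpret(had_impact=False, high_fidelity=True))
```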
  • Transcript of "Digging Deeper Into “Why” and “How”: Evaluation design ideas to further increase knowledge about programs that work for students"

    1. Digging Deeper Into “Why” and “How”: Evaluation design ideas to further increase knowledge about programs that work for students
       Lashawn Richburg-Hayes
       Preparing Today’s Students for Tomorrow’s Jobs in Metropolitan America
       University of Pennsylvania, May 25, 2011
    2. “Guidance” is a catch-all…
       • Academic advising and counseling (e.g. orientation, info. on navigating the college, assistance in course selection…)
       • Academic supports and “coaching” (e.g. tutoring, remedial assistance…)
       • Personal guidance and counseling (e.g. crisis intervention, info & referral…)
       • Career counseling (e.g. aptitude assessments, career planning…)
       • Supplemental services (childcare subsidies, transportation, books…)
    3. 3. 3<br />More intensive / “heavier touch”<br />Less intensive / “lighter touch”<br />
    4. Striking similarity of results [figure]
    5. Questions…
       • What is the theory of change?
       • How well were interventions implemented?
       • What components matter most? In-person contact or structured connection? How much dosage is enough?
       • Why are there differences by gender?
       • Cost effectiveness
    6. What does a lack of impact imply about whether the program works?
    7. Ultimate Goal: Avoid “Black Box” Evaluation
       “Simple…black box evaluations may provide a gross assessment of whether or not a program works but fail to identify the underlying causal mechanisms that generate the treatment effects, thus failing to pinpoint the deficiencies of the program for future program improvement or development.” (Chen, 1990)
    8. Helpful solutions…
       • Driver diagrams
       • Intervention strength and treatment contrast
       • Logic models
       • Implementation fidelity
    9. Driver diagram [figure]
    10. 10. Strength & Treatment contrast<br />10<br />Planned/expected services<br />Offered services<br />Received services<br />Received<br />ControlServices<br />Offered<br />Control Services<br />Expected <br />Control Services<br />Strength of program and control services measured<br />Treatment Contrast<br />Fidelity to design<br />Planned <br />InterventionServices<br />Received Intervention Services<br />Offered<br />Intervention Services<br />Strength of program defined<br />Participation Rate / Dosage<br />Context: Implementation narrative, student qualitative narrative, coverage / bias<br />Random Assignment<br />
    11. Pima’s Logic Model [figure]
    12. Contact information
        Lashawn Richburg-Hayes, Ph.D.
        Deputy Director, Young Adult and Postsecondary Education
        rhayes@mdrc.org
        212-340-7598
        For more information about MDRC’s work and to download our publications, go to: http://www.mdrc.org
