Evaluation 2013: The State of Evaluation
Practice in the Early 21st Century
27th Annual Conference of the American Evaluation Association
Washington, DC, USA
Design Considerations in Evaluating
the Implementation of College
Access, College Readiness, and
Career Pathway Initiatives
Thomas Horwood, Chair
Barbara O’Donnel, Discussant
October 18, 2013 | 4:30 - 6:00 PM
Determining Dosage: Evaluating
the Implementation of a State
GEAR UP Initiative
Ashley Briggs
Charles Dervarics
October 18, 2013
Presented to:
American Evaluation Association
2013 Conference
GEAR UP Nationally
OVERVIEW
• First grants funded in 1999 – Originally based on the I Have a Dream program concept
• Cohort approach – Most grantees follow a cohort of students from 7th grade through to postsecondary education, with support services (tutoring, mentoring, college visits) available
• FY12 – 132 federal grants serving 647,772 students
• Multiple award types – State grants and local partnership grants
About the Texas GEAR UP State Grant (SG)
OVERVIEW
• FY12 state grant from USED – approx. $5M per year (2012 to 2019)
• Focused on a single cohort of students starting in Grade 7 (students are in Grade 8 in 2013–14)
• Includes district and statewide services
• District services
– Support schools in four districts (7 MS and 5 HS) to increase academic rigor
– Increase number of Grade 8 students succeeding in Algebra I (short-term goal)
– Provide teacher professional development to support delivery of rigorous courses (such as Pre-AP training)
– Provide teacher professional development to support postsecondary goals (financial literacy)
– Promote vertical alignment of core subject teachers across the grades
– Support college visits, summer learning opportunities, and tutoring services
About the Texas GEAR UP State Grant (SG) (cont'd)
OVERVIEW
• Statewide Services
– Postsecondary information dissemination to students and families statewide
– Active, in-depth web site with information for students and families
– Online communication and teaching platform available statewide
– Statewide coalition of GEAR UP grantees (including local partnership grants not directly under the SG)
• TEA GEAR UP Partners
– The University of Texas at Austin's Institute for Public School Initiatives (IPSI)
– TG (Texas Guaranteed Student Loan Corporation)
– College Board
– AMS Pictures
About the Texas GEAR UP SG Evaluation
EVALUATION DESIGN AND METHODOLOGY
The external evaluation is a longitudinal 7-year study using a quasi-experimental design that started in January 2013 to:
• Provide ongoing formative evaluation of facilitators/barriers, promising practices, and recommended next steps
• Explore implementation status, trends in the mix of implementation, and relationships between implementation and outcomes
• Determine impact, including short-, intermediate-, and long-term student outcomes
• Identify impact on relevant family, school, and community partnership outcomes
• Examine access to and use of statewide opportunities
• Understand cost, spending, and sustainability
Data Sources
EVALUATION DESIGN AND METHODOLOGY
• Extant Data
– Documents: Texas GEAR UP SG Grant Application, Notices of Grant Award (NOGAs), and implementation plans
– Student-level data: Demographics, attendance, high school course completion and high school completion, school personnel, and district organizational information
– School-level data: Profile information about campus-level performance, staff, finances, and programs
• Student Tracking System (Annual Performance Report – APR)
– Format: Submission by 4 subgrantee districts using a prepopulated spreadsheet
– Topics: Advanced course-taking; Academic services; Student services; Student events and attendance; Parent events and attendance; Teacher professional development and enrollment; Community partners
Data Sources (cont.)
EVALUATION DESIGN AND METHODOLOGY
• Surveys with Parents and Students
– Format: Online and paper-based versions in English and Spanish
– Topics: Aspirations and expectations; Knowledge of financial aspects; Knowledge of college requirements; Perceptions of Texas GEAR UP SG
• Site Visits to Texas GEAR UP SG Schools
– Format: 1–1.5-day visits including interviews and focus groups with school staff, teachers, students, parents, and community partners
– Topics: GEAR UP activities and events (school and statewide); Knowledge of college requirements and financial aspects; Perceptions of Texas GEAR UP SG; Readiness for success in college
• Interviews with Key Leaders from TEA and Partner Organizations
– Format: Telephone interviews
– Topics: Level of partner involvement; Perceptions of program; Progress on statewide implementation
Initial Analysis: Implementation
ANALYSIS
• Data Source: Student tracking system (APR) and site visits
• Primary Analysis: Descriptive statistics on participation, dosage (number of hours, events), and mix (range of services/activities); disaggregation by school, subject area, and format (virtual or in-person); a sketch of this kind of analysis follows the summary below

Implementation strategies across the seven schools (A–G):
– Advanced course: 7 schools
– SSS: Tutoring: 7 schools (math-only at 2 schools)
– SSS: Mentoring: 2 schools
– SSS: Counseling/Advising: 1 school
– SSS: Other activities: 2 schools (math)
– College visit: 4 schools
– Job site visit: 1 school
– Student events: 6 schools
– Parent events: 5 schools
– Teacher PD: 4 schools
– Community partners: 4 schools
– Use of statewide services: 3 schools
Total strategies per school: A = 4, B = 6, C = 5, D = 5, E = 8, F = 7, G = 11
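To make the dosage analysis concrete, here is a minimal sketch of how participation, dosage, and mix statistics like those above could be computed, assuming a tidy event-level extract from the APR tracking system; the column names and values are illustrative placeholders, not the actual APR schema or study data.

```python
import pandas as pd

# Hypothetical APR-style service records: one row per student service event.
apr = pd.DataFrame({
    "school":  ["A", "A", "B", "B", "B", "C"],
    "service": ["tutoring", "mentoring", "tutoring", "tutoring", "college_visit", "tutoring"],
    "format":  ["in-person", "virtual", "in-person", "virtual", "in-person", "in-person"],
    "hours":   [1.5, 1.0, 2.0, 1.0, 4.0, 1.5],
})

# Dosage: total hours and number of events per school and service.
dosage = (apr.groupby(["school", "service"])
             .agg(total_hours=("hours", "sum"), n_events=("hours", "size")))

# Mix: how many distinct services each school delivered.
mix = apr.groupby("school")["service"].nunique()

# Disaggregation by delivery format (virtual vs. in-person).
by_format = apr.pivot_table(index="school", columns="format",
                            values="hours", aggfunc="sum", fill_value=0)

print(dosage, mix, by_format, sep="\n\n")
```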
Initial Analysis: Plans, Knowledge, and Perceptions
ANALYSIS
• Data Source: Student and parent surveys
• Primary Analysis (see the sketch after this slide)
– Descriptive statistics (frequencies, averages, ranges)
– Crosstabs (chi-square analyses comparing frequency distributions by subgroup)
– Analysis of variance (ANOVA) comparing means by subgroup
– Correlation
• Key Baseline Takeaway: Both parent and student aspirations often exceeded expectations, suggesting that parents and students are concerned about being able to achieve their educational dreams.
• Key Baseline Takeaway: Few students or parents perceive themselves as very knowledgeable, which participation in Texas GEAR UP SG can potentially change.
• Key Baseline Takeaway: Student overall satisfaction with Texas GEAR UP SG was highest at one school, where 41% of students indicated they were very satisfied.
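As a concrete illustration of the survey analyses listed above, the following sketch runs a chi-square test on a satisfaction-by-school crosstab and a one-way ANOVA on subgroup means using SciPy; all counts and scores are made-up placeholders, not study data.

```python
import numpy as np
from scipy import stats

# Hypothetical crosstab of satisfaction level (rows) by school (columns).
crosstab = np.array([[41, 25, 18],    # very satisfied
                     [30, 35, 40],    # somewhat satisfied
                     [29, 40, 42]])   # not satisfied
chi2, p_chi, dof, expected = stats.chi2_contingency(crosstab)

# One-way ANOVA comparing mean knowledge scores across three subgroups.
group_a = [3.1, 2.8, 3.5, 3.0]
group_b = [2.2, 2.5, 2.9, 2.4]
group_c = [3.6, 3.3, 3.8, 3.4]
f_stat, p_anova = stats.f_oneway(group_a, group_b, group_c)

print(f"chi-square p = {p_chi:.3f}, ANOVA p = {p_anova:.3f}")
```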
Initial Analysis: Costs and Lessons Learned
ANALYSIS
• Cost
– Data Source: Budgets and reported drawdowns
– Primary Analysis: Descriptive statistics, breakdown by cost categories
• Facilitators and Barriers
– Data Source: Survey and site visit data
– Primary Analysis: Descriptive statistics, analysis of open-ended survey responses, qualitative analysis
– Key Baseline Takeaway: Parents reported that engagement in activities is facilitated when topics are of interest to them, when events are held at times appropriate for their schedule, and when their student is also engaged.
• Potentially Promising Practices
– Data Source: Site visit data
– Primary Analysis: Qualitative analysis
– Key Baseline Takeaway: Early successes at some schools related to afterschool mathematics programs, enhanced college visits, and family events.
Forthcoming Analysis beyond Year 1
ANALYSIS
• Level and Mix of Implementation: Analysis of various service factors
– Provision type (virtual or in-person)
– Frequency of delivery (number of hours, number of sessions)
– Mix of services (e.g., enrollment in and tutoring in an advanced course)
– Quality of implemented activities
• Plans, Knowledge, and Perceptions: Disaggregation by student characteristics
– Gender, race/ethnicity, LEP status, special education status
– Participation in advanced coursework
• Cost
– Descriptive analysis of actual expenditures (annual and cumulative) by cost category
• Types of Analysis (a matching sketch follows this list)
– HLM (with student, school, and district levels) and cluster analysis
– Impact analysis using extant outcome data
– Comparisons using propensity score matching (PSM)
– Linkages between outcomes and implementation
– Change in implementation over time
– Relationship of actual implementation to proposed plans
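The PSM comparisons could proceed along the lines of the following sketch: estimate propensity scores with a logistic regression, then match each treated student to the nearest comparison student on that score. This is a generic illustration, not the evaluation's actual matching specification; the covariates and sample are simulated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Simulated student-level data: three covariates and a participation flag
# that depends on the first covariate (so matching has something to correct).
X = rng.normal(size=(500, 3))
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))

# Step 1: estimate propensity scores with a logistic regression.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each treated student to the nearest comparison student
# on the propensity score (1:1 nearest-neighbor matching with replacement).
controls = np.flatnonzero(treated == 0)
nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated == 1].reshape(-1, 1))
matched_controls = controls[idx.ravel()]

print(f"{(treated == 1).sum()} treated students matched to "
      f"{len(set(matched_controls))} distinct comparison students")
```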
Lessons About This Evaluation from Year 1
LESSONS
• Interpret findings with caution based on the period of data collection.
• Use a crosswalk to address 60+ evaluation questions.
• Ensure common definitions of program services.
• Consider ways to verify information across data sources.
• Maximize the use of online surveys.
• Leverage various strategies to obtain sufficient parent response rates.
• Analyze data at multiple levels (school and district).
• Utilize district-level case studies to understand the context in which implementation occurs.
For more information, the report is publicly available:
O'Donnel, B., Briggs, A., Dervarics, C., Horwood, T., Sun, J., Alexander, A., Zumdahl, J., & Rhodes, J. (2013, September). Annual Implementation Report #1: Texas GEAR UP State Grant Evaluation. Report prepared for the Texas Education Agency by ICF International. Available online at:
http://www.tea.state.tx.us/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=25769807659&libID=25769807662#25769807659
Assessing the Fidelity of
Implementation in the
Diplomas Now Evaluation
October 18, 2013
Presented to:
American Evaluation Association
2013 Conference
Felix Fernandez
Aracelis Gray
Overview of Diplomas Now Study
Diplomas Now is a school turnaround model that unites three organizations: Talent Development, City Year, and Communities In Schools.
The study uses a random assignment design, with sixty-two schools in 11 districts across the country participating.
The study will compare student outcomes in the 32 middle and high schools that implement DN to those in the 30 schools that do not.
Overview of Diplomas Now Implementation Study
• Overall goal: document implementation in the 32 DN schools.
• Research Questions:
– How much variation in implementation fidelity was there across sites?
– What were the largest challenges to implementing the DN model?
– What were the most important ways in which the intervention as implemented differed from the intervention as planned?
DN Fidelity of Implementation
• Fidelity of implementation is based on the DN Logic Model and measured by the Fidelity of Implementation Matrix.
• The matrix is made up of 111 separate components, 62 of which were identified as critical to adequate implementation.
• The fidelity matrix consists of 9 inputs, ranging from program staff training to family and community involvement to student supports.
DN Fidelity of Implementation
• That is, each input is made up of multiple components (e.g., Input 1 comprises Components X, Y, and Z).
DN Fidelity of Implementation
• And overall fidelity is built up from the inputs (Inputs 1–9 roll up into an Overall Fidelity measure).
Overview of Fidelity Matrix
– Program Staff Training and Professional Development
• 18 individual components, 15 of which are critical
– Integrated On-Site Support (Critical Input)
• 11 individual components, 9 of which are critical
– Family and Community Involvement
• 6 individual components, 1 of which is critical
– Tiered Intervention Model (Critical Input)
• 3 individual components, 2 of which are critical
– Strong Learning Environments (Critical Input)
• 6 individual components, 4 of which are critical
– Includes 1 MS and 1 HS specific critical component
Overview of Fidelity Matrix
– Professional Development and Peer Coaching
(Critical Input)
• 5 individual components, 2 of which are critical
– Includes 1 HS specific component
– Curriculum for College Readiness
• 24 individual components, 4 of which are critical
– Includes 7 MS and 17 HS specific components
– Student Supports (Critical Input)
• 24 individual items, 19 of which are critical
– Student Case Management (Critical Input)
• 14 individual items, 5 of which are critical
DN Fidelity of Implementation
• Fidelity is divided into two metrics, a categorical rating and a continuous score:
1. Implementation Rating (categorical measure): focuses on critical components
2. Implementation Score (continuous measure): allows for assessment of greater variability between sites
• Together they provide the flexibility to:
– Look at categories that emerge
– See if scores vary by rating, amount of overlap, etc.
– Study relationships between implementation and outcomes
Data Sources
• Fidelity of implementation data stem from the following sources:
– Diplomas Now Implementation Support Team (DNIST) Survey
– School Transformation Facilitator (STF) Survey
– City Year Program Manager (CYPM) Survey
– Communities In Schools (CIS) Site Coordinator (SC) Survey
– Communities In Schools (CIS) Site Records
Fidelity of Implementation: Strong Learning Environments (sample matrix rows)

Small Learning Communities (Critical: Yes)
– Operational definition: Teams of teachers working with the same small group of students
– Fidelity scale: 0 = No; 1 = Yes
– Criterion: 1 = Adequate/High Fidelity
– Sample response: 1 (Yes)

Interdisciplinary Teams (Critical: Yes)
– Operational definition: Frequency of interdisciplinary team meetings
– Fidelity scale: 0 = do not/rarely occur; 1 = occur monthly; 2 = occur bi-weekly; 3 = occur weekly; 4 = occur multiple times a week; 5 = occur daily
– Criterion: 4 = Adequate; 5 = High Fidelity
– Sample response: 4 (occur multiple times a week)

DN Site-Based Meeting (Critical: Yes)
– Operational definition: Admin, STF, Program Manager, and Site Coordinator hold a brief review of program implementation (approx. 30 minutes)
– Fidelity scale: 0 = once a month or less; 1 = biweekly; 2 = weekly or more frequently
– Criterion: 1 = Adequate; 2 = High Fidelity
– Sample response: 0 (once a month or less)

DN Site-Based Collaboration (Critical: No)
– Operational definition: Site-based collaborative (Admin, STF, PM, Site Coordinator) has norms for collaboration, standards for communication, and frameworks for decision making
– Fidelity scale: 0 = No; 1 = Partially/In Process; 2 = Yes
– Criterion: 1 = Adequate; 2 = High Fidelity
– Sample response: 1 (Partially/In Process)

4x4 Block (High School Only Question) (Critical: Yes)
– Operational definition: Four 75–90 minute class periods that meet daily
– Fidelity scale: 0 = No; 1 = Hybrid/Acceptable Alternative; 2 = Yes
– Criterion: 1 = Adequate; 2 = High Fidelity
– Sample response: 0 (No)
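One way to encode a matrix row like those above is a small record type holding the scale, criterion, and critical flag. The following is an illustrative encoding of the Strong Learning Environments rows, not the study's instrument or code.

```python
from dataclasses import dataclass

@dataclass
class FidelityComponent:
    """One row of the fidelity matrix (an illustrative encoding)."""
    name: str
    scale_max: int    # top of the ordinal fidelity scale (1, 2, or 5 above)
    adequate_at: int  # lowest scale value counting as adequate fidelity
    critical: bool    # whether the component is critical to the input rating

    def meets_criterion(self, response: int) -> bool:
        return response >= self.adequate_at

# The five Strong Learning Environments rows above, paired with the
# sample responses from the same slide.
sle = [
    (FidelityComponent("Small Learning Communities", 1, 1, True), 1),
    (FidelityComponent("Interdisciplinary Teams", 5, 4, True), 4),
    (FidelityComponent("DN Site-Based Meeting", 2, 1, True), 0),
    (FidelityComponent("DN Site-Based Collaboration", 2, 1, False), 1),
    (FidelityComponent("4x4 Block (HS only)", 2, 1, True), 0),
]
for comp, resp in sle:
    print(f"{comp.name}: meets criterion = {comp.meets_criterion(resp)}")
```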
Implementation Rating
The implementation rating focuses on the "Critical to Fidelity/Adequate Rating" column within the fidelity matrix. Using this column, each input (e.g., program staff professional development) of the DN model received one of two ratings:
1. "Successful" – met all components identified as critical
2. "Developing" – did not meet one or more critical components
In addition to critical components, critical inputs have also been identified (i.e., inputs critical to an adequate implementation).
Implementation Rating
• Individual input ratings served as the basis for the site-level fidelity rating, which has four levels:
1. Low: successful on fewer than 3 critical inputs
2. Moderate: successful on at least 3 critical inputs
3. Solid: successful on at least 5 critical inputs
4. High: successful on 8 or more inputs, including 5 critical inputs
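The two-level logic (component criteria determine each input rating, which determine the site rating) can be sketched directly from the rules above. This is an illustrative reading of the slide's thresholds, applied from the top down, not the study's actual code.

```python
def input_rating(components):
    """'Successful' only if every critical component meets its criterion.
    `components` is a list of (met_criterion, is_critical) pairs."""
    met_all_critical = all(met for met, critical in components if critical)
    return "Successful" if met_all_critical else "Developing"

# Strong Learning Environments sample from the earlier slide: two critical
# components (site-based meeting, 4x4 block) miss their criterion.
sle = [(True, True), (True, True), (False, True), (True, False), (False, True)]
print(input_rating(sle))  # Developing

def site_rating(successful_inputs, successful_critical):
    """Site-level rating from counts of 'Successful' inputs, applying the
    four thresholds above from the top down."""
    if successful_inputs >= 8 and successful_critical >= 5:
        return "High"
    if successful_critical >= 5:
        return "Solid"
    if successful_critical >= 3:
        return "Moderate"
    return "Low"

print(site_rating(successful_inputs=6, successful_critical=4))  # Moderate
```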
Example: Implementation Rating
• The Implementation Rating only takes into account components identified as critical. In this case:
– Teams of teachers working with the same small group of students
– Frequency of interdisciplinary team meetings
– DN site-based meeting
– 4x4 block
• Given our sample responses, this site has met the criterion for adequate implementation for teams of teachers and frequency of interdisciplinary team meetings, but not DN site-based meetings or 4x4 classroom blocks.
• It would therefore receive an implementation rating of "Developing" on this input.
DN Fidelity Implementation Rating Flowchart
[Flowchart summary: each of the nine inputs (Program Staff Professional Development; Integrated On-Site Support*; Tiered Intervention Model*; Professional Development and Peer Coaching*; Student Supports*; Student Case Management*; Family and Community Involvement; Strong Learning Environments*; Curriculum for College Readiness) is rated Successful or Developing. The input ratings roll up to a site-level rating: Low Implementation (successful on fewer than 3 critical inputs), Moderate Implementation (successful on at least 3 critical inputs), Solid Implementation (successful on 5 critical inputs), High Implementation (successful on 8+ inputs). * indicates critical inputs.]
Implementation Score
The implementation score focuses on the "Fidelity Scale" column within the fidelity matrix.
Using this column, each site received an input score, calculated as the equally weighted sum of the site's fidelity scale responses divided by the total number of components.
The average of the 9 individual input scores then formed the site-level implementation score.
Example: Implementation Score
• Implementation scores are calculated by taking the sum of the weighted responses divided by the total number of components. (A worked sketch follows this slide.)
– Scale scores are equally weighted; for example, a component scaled 0–2 would be recoded 0 = 0, 1 = .5, and 2 = 1.
• Adding up the weighted fidelity scale responses would equal 2.3 (1 + .8 + 0 + .5 + 0).
• There are 5 Strong Learning Environments components.
• The site's implementation score for this input would then equal 2.3 divided by 5, or .46.
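The worked example translates directly into a few lines of code; the (response, scale maximum) pairs below are the sample responses from the Strong Learning Environments matrix slide.

```python
# Sample responses from the Strong Learning Environments matrix slide,
# as (response, scale_max) pairs for the five HS components.
responses = [(1, 1), (4, 5), (0, 2), (1, 2), (0, 2)]

# Equal weighting: rescale each response to 0-1, so a 0-2 scale
# becomes 0, .5, 1 as described above.
weighted = [resp / scale_max for resp, scale_max in responses]
# weighted == [1.0, 0.8, 0.0, 0.5, 0.0]; sum == 2.3

input_score = sum(weighted) / len(responses)  # 2.3 / 5
print(round(input_score, 2))  # 0.46
```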
DN Fidelity Implementation Score Flowchart
For each input, the input score is X divided by the number of components in that input, and the site-level score is the sum of the nine input scores divided by 9:
– Program Staff Professional Development: X / 18
– Integrated On-Site Support: X / 11
– Family and Community Involvement: X / 6
– Tiered Intervention Model: X / 3
– Strong Learning Environments: X / 5
– Professional Development and Peer Coaching: X / 5
– Curriculum for College Readiness: X / 17
– Student Supports: X / 24
– Student Case Management: X / 14
Site-Level Score = [(X / 18) + (X / 11) + (X / 6) + (X / 3) + (X / 5) + (X / 5) + (X / 17) + (X / 24) + (X / 14)] / 9, i.e., the average of the nine input scores.
Note: X is the equally weighted sum of fidelity scale components for each input. Sample calculations provided are only for HS data.
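Rolling input scores up to the site level then looks like the following sketch; the per-input weighted sums (X) are invented placeholders, while the component counts (denominators) are the HS values from the flowchart above.

```python
# Invented weighted sums (X) per input for one hypothetical HS site.
inputs = {
    "Program Staff Professional Development": (12.0, 18),
    "Integrated On-Site Support":             (9.0, 11),
    "Family and Community Involvement":       (4.0, 6),
    "Tiered Intervention Model":              (2.0, 3),
    "Strong Learning Environments":           (2.3, 5),
    "PD and Peer Coaching":                   (3.5, 5),
    "Curriculum for College Readiness":       (10.0, 17),
    "Student Supports":                       (15.0, 24),
    "Student Case Management":                (8.0, 14),
}

# Input score = X / number of components; site score = mean of the 9 inputs.
input_scores = {name: x / n for name, (x, n) in inputs.items()}
site_score = sum(input_scores.values()) / len(input_scores)
print(round(site_score, 3))
```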
Fidelity of Implementation
• Independently, each measure provides useful but different information.
• Together, they provide flexibility in understanding implementation, allow for detailed discussion of site fidelity, and help to shape an implementation story.
Evaluating a Career Pathways
Research Initiative
PathTech: Successful Academic and
Employment Pathways in Advanced
Technology
Kristen Peterson
October 18, 2013
Presented to:
American Evaluation Association
2013 Conference
The PathTech Program
Background on the PathTech Project
• Funded through a grant from the National Science Foundation (NSF) under the Advanced Technological Education (ATE) Program
• ATE promotes the improvement of education for science and engineering technicians entering high-technology fields
• The ATE program supports many different types of activities, including:
– Articulation between two-year and four-year programs
– Career pathways
– Curriculum development
– Educator professional development
– General research advancing the understanding of educating technicians for careers in high-technology fields
Background on the PathTech Project
• Successful Academic and Employment Pathways in Advanced Technologies (PathTech)
– A research study examining the progression of students from high school into advanced technology programs and into the workforce
– A four-year study currently entering the third year of the project
• Collaborative grant awarded to the University of South Florida (USF)
– Grant partnership includes USF researchers, the Florida Advanced Technological Education Center (FLATE), and four south Florida community colleges:
• Hillsborough Community College
• Polk State College
• St. Petersburg College
• State College of Florida
PathTech Research Design and Evaluation
PathTech Research Questions
1. Who enrolls in engineering technology (ET) programs out of high school?
– How are student demographic and academic characteristics related to ET
enrollment?
– How do students learn about ET programs?
– How can the pathway from high school into ET programs be improved?
2. How do ET students benefit from enrolling (in degree programs) and
earning degrees through these programs?
– What are the most critical steps in ET degree attainment from enrollment through
gatekeeper courses and to the degree?
– How do these students become ET program graduates?
– How do the ET students differ from comparable students in their degree and
employment outcomes?
Design Considerations for PathTech
• Mixed-methods study
• Quantitative Data and Analysis:
– Descriptive statistics and empirical analysis with quantitative data from state databases
• Qualitative Data and Analysis:
– Ethnographic and qualitative analyses of engineering technology programs
– Three data sources:
• Interviews with community college students
• Interviews with students at feeder high schools
• Interviews with local industry partners
Evaluation Approach
• The external evaluation of PathTech complements and supports the efforts of the PathTech research team and involves:
1. Monitoring the progress of various aspects of the project
2. Providing objective reviews of project instruments, plans, reports, and other materials
3. Serving as an external resource for technical advice
• Designed a flexible evaluation model to account for a dynamic work plan
– Developed an annual workplan tracker
– Update the workplan tracker monthly
– Facilitate monthly status calls with the research team
PathTech Evaluation Challenges
Evaluation Challenges: Quantitative Data Analysis
• Using data from the Florida Department of Education's Florida Education and Training Placement Information Program (FETPIP)
– Initial requests for data went unanswered
– New policy not to release FETPIP employment data in conjunction with supplemental educational data that includes demographic data
• Pursuing data from other sources, including the National Academy Foundation (NAF)
– NAF is a network of over 500 career-themed academies across the country, which include engineering as a career theme.
– The data would provide longitudinal national, state, and regional student-level data, including data on academic performance, demographic characteristics, and academy assessments.
Evaluation Challenges: Qualitative Data Collection
• Multiple sources of qualitative data, including high schools with a focus on engineering technology, community colleges, and engineering technology industry partners
– Initial pilot study at one community college, one high school, and one industry partner
– Data collection and analyses are underway for community colleges and for industry employers and recruiters
– Challenge recruiting high schools with relevant engineering technology programs
• Participant and stakeholder buy-in across sites
– A particular challenge among area high schools
Evaluation Challenges: External Evaluation Considerations
• Challenge: Monitoring progress with a flexible project design and dynamic work plan
– Tracking changing timelines
– Reporting on initial and updated benchmarks
• Response: Task management and evaluation tools (a sketch follows this list)
– Monthly progress meetings
– Flexible workplan task tracker, designed to accommodate high-level and individual task changes
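A flat-file version of such a tracker might look like the sketch below; the actual tracker was a spreadsheet, and the tasks, fields, and dates here are invented placeholders rather than the project's real work plan.

```python
import csv

# Each row carries an initial and an updated benchmark so slips against the
# dynamic work plan are visible at the monthly status call.
FIELDS = ["task", "owner", "initial_benchmark", "updated_benchmark", "status"]
tasks = [
    {"task": "Finalize interview protocols", "owner": "Research team",
     "initial_benchmark": "2013-06-30", "updated_benchmark": "2013-09-30",
     "status": "complete"},
    {"task": "FETPIP data request", "owner": "Research team",
     "initial_benchmark": "2013-03-31", "updated_benchmark": "TBD",
     "status": "blocked"},
]

with open("workplan_tracker.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(tasks)

# Monthly review: flag tasks that slipped or have no updated benchmark yet.
flagged = [t["task"] for t in tasks
           if t["updated_benchmark"] == "TBD"
           or (t["updated_benchmark"] != t["initial_benchmark"]
               and t["status"] != "complete")]
print("Review at next status call:", flagged)
```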
Workplan Tracker
Current Project Status and Next Steps
• Qualitative pilot studies have been completed with high school students, community college students, and industry partners
– Pilot interviews were transcribed and thematically coded
– Initial findings indicate three main factors influencing engineering technology pathways and several pathways leading into engineering technology fields
• Interview protocols have been finalized, and community college students have been recruited into the study
• 20 in-depth interviews with industry members were completed; transcription and analysis are underway
• Ongoing challenges obtaining quantitative data; pursuing alternate data sources
• Multiple publications in progress
• Continue to monitor the research team's progress and help overcome barriers
Contact Information
Program Evaluation
• Barbara O'Donnel, Principal, Barbara.ODonnel@icfi.com
• T.J. Horwood, Senior Manager, Thomas.Horwood@icfi.com
Texas GEAR UP State Grant
• Ashley Briggs, Senior Associate, Ashley.Briggs@icfi.com
• Chuck Dervarics, Expert Consultant, Chuck.Dervarics@icfi.com
Diplomas Now
• Felix Fernandez, Technical Specialist, Felix.Fernandez@icfi.com
• Aracelis Gray, Senior Manager, Aracelis.Gray@icfi.com
PathTech
• Kristen Peterson, Senior Associate, Kristen.Peterson@icfi.com
More Related Content

What's hot

Needs Assessment Powerpoint 2007
Needs Assessment Powerpoint 2007Needs Assessment Powerpoint 2007
Needs Assessment Powerpoint 2007
Johan Koren
 
Building Data Literacy Among Middle School Administrators and Teachers
Building Data Literacy Among Middle School Administrators and TeachersBuilding Data Literacy Among Middle School Administrators and Teachers
Building Data Literacy Among Middle School Administrators and Teachers
North Carolina Association for Middle Level Education
 
Ensuring Opportunity Summary
Ensuring Opportunity SummaryEnsuring Opportunity Summary
Ensuring Opportunity Summary
Mebane Rash
 
Wsu Ppt Building District Data Capacity
Wsu Ppt Building District Data CapacityWsu Ppt Building District Data Capacity
Wsu Ppt Building District Data Capacity
Glenn E. Malone, EdD
 
Macfadyen usc tlt keynote 2015.pptx
Macfadyen usc tlt keynote 2015.pptxMacfadyen usc tlt keynote 2015.pptx
Macfadyen usc tlt keynote 2015.pptx
Leah Macfadyen
 
Prince cv may 2016
Prince cv may 2016Prince cv may 2016
Prince cv may 2016
Heath Prince
 
Career Planning Ma Session
Career Planning Ma SessionCareer Planning Ma Session
Career Planning Ma Session
Karen A. DeCoster
 
Key Findings from Focus Groups with College Students
Key Findings from Focus Groups with College StudentsKey Findings from Focus Groups with College Students
Key Findings from Focus Groups with College Students
Robert Kelly
 
Understanding the U.S. News & World Report “Best Colleges” 2007 - Handout
Understanding the U.S. News & World Report “Best Colleges” 2007 - HandoutUnderstanding the U.S. News & World Report “Best Colleges” 2007 - Handout
Understanding the U.S. News & World Report “Best Colleges” 2007 - Handout
Matthew Hendrickson
 
Missouri ACT Identified Keys to Enrollment Success
Missouri ACT Identified Keys to Enrollment SuccessMissouri ACT Identified Keys to Enrollment Success
Missouri ACT Identified Keys to Enrollment Success
StephaneGeyer
 
Resume.docx
Resume.docxResume.docx
Resume.docx
Davin Johnson
 
Linked in thomas v. millington resume_2021.docx
Linked in thomas v. millington resume_2021.docxLinked in thomas v. millington resume_2021.docx
Linked in thomas v. millington resume_2021.docx
Tom Millington
 
Wsu District Capacity Of Well Crafted District Wide System Of Support
Wsu District Capacity Of Well Crafted District Wide System Of SupportWsu District Capacity Of Well Crafted District Wide System Of Support
Wsu District Capacity Of Well Crafted District Wide System Of Support
WSU Cougars
 
Data and Assessment in Academic Libraries: Linking Freshmen Student Success a...
Data and Assessment in Academic Libraries: Linking Freshmen Student Success a...Data and Assessment in Academic Libraries: Linking Freshmen Student Success a...
Data and Assessment in Academic Libraries: Linking Freshmen Student Success a...
Georgia Libraries Conference (formerly Ga COMO).
 
From Data To Information Perspectives On Policy And Practice
From Data To Information  Perspectives On Policy And PracticeFrom Data To Information  Perspectives On Policy And Practice
From Data To Information Perspectives On Policy And Practice
Jeff_Watson
 
CCSA 2015 225. Increasing the Teacher's Effectiveness Toolbox
CCSA 2015 225. Increasing the Teacher's Effectiveness ToolboxCCSA 2015 225. Increasing the Teacher's Effectiveness Toolbox
CCSA 2015 225. Increasing the Teacher's Effectiveness Toolbox
joniallison23
 
Presentación Justine Hustings (Brown University)
Presentación Justine Hustings (Brown University)Presentación Justine Hustings (Brown University)
Presentación Justine Hustings (Brown University)
Ceppe Chile
 
Selecting the Most Important Predictors of Computer Science Students' Online ...
Selecting the Most Important Predictors of Computer Science Students' Online ...Selecting the Most Important Predictors of Computer Science Students' Online ...
Selecting the Most Important Predictors of Computer Science Students' Online ...
Qiang Hao
 
Standing at the Intersection of College Choice and Competency-based Education...
Standing at the Intersection of College Choice and Competency-based Education...Standing at the Intersection of College Choice and Competency-based Education...
Standing at the Intersection of College Choice and Competency-based Education...
American Public University System
 
Improving Outcomes for All Students: Strategies and Considerations to Increas...
Improving Outcomes for All Students: Strategies and Considerations to Increas...Improving Outcomes for All Students: Strategies and Considerations to Increas...
Improving Outcomes for All Students: Strategies and Considerations to Increas...
Mohammed Choudhury
 

What's hot (20)

Needs Assessment Powerpoint 2007
Needs Assessment Powerpoint 2007Needs Assessment Powerpoint 2007
Needs Assessment Powerpoint 2007
 
Building Data Literacy Among Middle School Administrators and Teachers
Building Data Literacy Among Middle School Administrators and TeachersBuilding Data Literacy Among Middle School Administrators and Teachers
Building Data Literacy Among Middle School Administrators and Teachers
 
Ensuring Opportunity Summary
Ensuring Opportunity SummaryEnsuring Opportunity Summary
Ensuring Opportunity Summary
 
Wsu Ppt Building District Data Capacity
Wsu Ppt Building District Data CapacityWsu Ppt Building District Data Capacity
Wsu Ppt Building District Data Capacity
 
Macfadyen usc tlt keynote 2015.pptx
Macfadyen usc tlt keynote 2015.pptxMacfadyen usc tlt keynote 2015.pptx
Macfadyen usc tlt keynote 2015.pptx
 
Prince cv may 2016
Prince cv may 2016Prince cv may 2016
Prince cv may 2016
 
Career Planning Ma Session
Career Planning Ma SessionCareer Planning Ma Session
Career Planning Ma Session
 
Key Findings from Focus Groups with College Students
Key Findings from Focus Groups with College StudentsKey Findings from Focus Groups with College Students
Key Findings from Focus Groups with College Students
 
Understanding the U.S. News & World Report “Best Colleges” 2007 - Handout
Understanding the U.S. News & World Report “Best Colleges” 2007 - HandoutUnderstanding the U.S. News & World Report “Best Colleges” 2007 - Handout
Understanding the U.S. News & World Report “Best Colleges” 2007 - Handout
 
Missouri ACT Identified Keys to Enrollment Success
Missouri ACT Identified Keys to Enrollment SuccessMissouri ACT Identified Keys to Enrollment Success
Missouri ACT Identified Keys to Enrollment Success
 
Resume.docx
Resume.docxResume.docx
Resume.docx
 
Linked in thomas v. millington resume_2021.docx
Linked in thomas v. millington resume_2021.docxLinked in thomas v. millington resume_2021.docx
Linked in thomas v. millington resume_2021.docx
 
Wsu District Capacity Of Well Crafted District Wide System Of Support
Wsu District Capacity Of Well Crafted District Wide System Of SupportWsu District Capacity Of Well Crafted District Wide System Of Support
Wsu District Capacity Of Well Crafted District Wide System Of Support
 
Data and Assessment in Academic Libraries: Linking Freshmen Student Success a...
Data and Assessment in Academic Libraries: Linking Freshmen Student Success a...Data and Assessment in Academic Libraries: Linking Freshmen Student Success a...
Data and Assessment in Academic Libraries: Linking Freshmen Student Success a...
 
From Data To Information Perspectives On Policy And Practice
From Data To Information  Perspectives On Policy And PracticeFrom Data To Information  Perspectives On Policy And Practice
From Data To Information Perspectives On Policy And Practice
 
CCSA 2015 225. Increasing the Teacher's Effectiveness Toolbox
CCSA 2015 225. Increasing the Teacher's Effectiveness ToolboxCCSA 2015 225. Increasing the Teacher's Effectiveness Toolbox
CCSA 2015 225. Increasing the Teacher's Effectiveness Toolbox
 
Presentación Justine Hustings (Brown University)
Presentación Justine Hustings (Brown University)Presentación Justine Hustings (Brown University)
Presentación Justine Hustings (Brown University)
 
Selecting the Most Important Predictors of Computer Science Students' Online ...
Selecting the Most Important Predictors of Computer Science Students' Online ...Selecting the Most Important Predictors of Computer Science Students' Online ...
Selecting the Most Important Predictors of Computer Science Students' Online ...
 
Standing at the Intersection of College Choice and Competency-based Education...
Standing at the Intersection of College Choice and Competency-based Education...Standing at the Intersection of College Choice and Competency-based Education...
Standing at the Intersection of College Choice and Competency-based Education...
 
Improving Outcomes for All Students: Strategies and Considerations to Increas...
Improving Outcomes for All Students: Strategies and Considerations to Increas...Improving Outcomes for All Students: Strategies and Considerations to Increas...
Improving Outcomes for All Students: Strategies and Considerations to Increas...
 

Similar to ICF_AEA_multipaper

MD8Assgn: A8: Course Project—Program Proposal
MD8Assgn: A8: Course Project—Program ProposalMD8Assgn: A8: Course Project—Program Proposal
MD8Assgn: A8: Course Project—Program Proposal
eckchela
 
Harnessing Decentralized Data to Improve Advising and Student Success - NASPA...
Harnessing Decentralized Data to Improve Advising and Student Success - NASPA...Harnessing Decentralized Data to Improve Advising and Student Success - NASPA...
Harnessing Decentralized Data to Improve Advising and Student Success - NASPA...
Naviance
 
Week One - Why Data?
Week One - Why Data?Week One - Why Data?
Week One - Why Data?
Rich Parker
 
Disrupted Futures 2023 | Learning from large-scale, longitudinal datasets
Disrupted Futures 2023 | Learning from large-scale, longitudinal datasetsDisrupted Futures 2023 | Learning from large-scale, longitudinal datasets
Disrupted Futures 2023 | Learning from large-scale, longitudinal datasets
EduSkills OECD
 
Assessing the Impact of Mentoring: Lessons Learned from a Research Study in W...
Assessing the Impact of Mentoring: Lessons Learned from a Research Study in W...Assessing the Impact of Mentoring: Lessons Learned from a Research Study in W...
Assessing the Impact of Mentoring: Lessons Learned from a Research Study in W...
ICF
 
Practitioner Defense 7_19_16
Practitioner Defense 7_19_16Practitioner Defense 7_19_16
Practitioner Defense 7_19_16
Tyson Holder, Ed.D., SSP, MS
 
SXSWedu 2015 NCAN Panel Deck
SXSWedu 2015 NCAN Panel DeckSXSWedu 2015 NCAN Panel Deck
SXSWedu 2015 NCAN Panel Deck
debaunb
 
Measuring What Matters: Noncognitive Skills - GRIT
Measuring What Matters: Noncognitive Skills - GRITMeasuring What Matters: Noncognitive Skills - GRIT
Measuring What Matters: Noncognitive Skills - GRIT
SmarterServices Owen
 
CypherWorx OST Effiacy Study Results 2015
CypherWorx OST Effiacy Study Results 2015CypherWorx OST Effiacy Study Results 2015
CypherWorx OST Effiacy Study Results 2015
Steve Stookey
 
GradNation Acceleration Grant Informational Webinar
GradNation Acceleration Grant Informational WebinarGradNation Acceleration Grant Informational Webinar
GradNation Acceleration Grant Informational Webinar
America's Promise Alliance
 
Focus on Student Engagement: Individual Learning Plans
Focus on Student Engagement: Individual Learning PlansFocus on Student Engagement: Individual Learning Plans
Focus on Student Engagement: Individual Learning Plans
Hobsons
 
Impacting College-Going and Completion Rates in Your Community: Taking a Syst...
Impacting College-Going and Completion Rates in Your Community: Taking a Syst...Impacting College-Going and Completion Rates in Your Community: Taking a Syst...
Impacting College-Going and Completion Rates in Your Community: Taking a Syst...
National Partnership for Educational Access
 
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Tanya Joosten
 
Assessment
AssessmentAssessment
Assessment
amyplayersmith
 
Credit Flexibility Presentation by Sarah Luchs
Credit Flexibility Presentation by Sarah LuchsCredit Flexibility Presentation by Sarah Luchs
Credit Flexibility Presentation by Sarah Luchs
Eric Calvert
 
A guide for comprehensive needs assessment
A guide for comprehensive needs assessmentA guide for comprehensive needs assessment
A guide for comprehensive needs assessment
k1hinze
 
A guide for comprehensive needs assessment
A guide for comprehensive needs assessmentA guide for comprehensive needs assessment
A guide for comprehensive needs assessment
k1hinze
 
Cdpi keynote
Cdpi keynoteCdpi keynote
Cdpi keynote
cdpindiana
 
Virtual Learning Policy Consideration (iNacol)
Virtual Learning Policy Consideration (iNacol)Virtual Learning Policy Consideration (iNacol)
Virtual Learning Policy Consideration (iNacol)
Blacketor Consultants, LLC
 
A Quiet Crisis
A Quiet CrisisA Quiet Crisis
A Quiet Crisis
Diana Laboy-Rush
 

Similar to ICF_AEA_multipaper (20)

MD8Assgn: A8: Course Project—Program Proposal
MD8Assgn: A8: Course Project—Program ProposalMD8Assgn: A8: Course Project—Program Proposal
MD8Assgn: A8: Course Project—Program Proposal
 
Harnessing Decentralized Data to Improve Advising and Student Success - NASPA...
Harnessing Decentralized Data to Improve Advising and Student Success - NASPA...Harnessing Decentralized Data to Improve Advising and Student Success - NASPA...
Harnessing Decentralized Data to Improve Advising and Student Success - NASPA...
 
Week One - Why Data?
Week One - Why Data?Week One - Why Data?
Week One - Why Data?
 
Disrupted Futures 2023 | Learning from large-scale, longitudinal datasets
Disrupted Futures 2023 | Learning from large-scale, longitudinal datasetsDisrupted Futures 2023 | Learning from large-scale, longitudinal datasets
Disrupted Futures 2023 | Learning from large-scale, longitudinal datasets
 
Assessing the Impact of Mentoring: Lessons Learned from a Research Study in W...
Assessing the Impact of Mentoring: Lessons Learned from a Research Study in W...Assessing the Impact of Mentoring: Lessons Learned from a Research Study in W...
Assessing the Impact of Mentoring: Lessons Learned from a Research Study in W...
 
Practitioner Defense 7_19_16
Practitioner Defense 7_19_16Practitioner Defense 7_19_16
Practitioner Defense 7_19_16
 
SXSWedu 2015 NCAN Panel Deck
SXSWedu 2015 NCAN Panel DeckSXSWedu 2015 NCAN Panel Deck
SXSWedu 2015 NCAN Panel Deck
 
Measuring What Matters: Noncognitive Skills - GRIT
Measuring What Matters: Noncognitive Skills - GRITMeasuring What Matters: Noncognitive Skills - GRIT
Measuring What Matters: Noncognitive Skills - GRIT
 
CypherWorx OST Effiacy Study Results 2015
CypherWorx OST Effiacy Study Results 2015CypherWorx OST Effiacy Study Results 2015
CypherWorx OST Effiacy Study Results 2015
 
GradNation Acceleration Grant Informational Webinar
GradNation Acceleration Grant Informational WebinarGradNation Acceleration Grant Informational Webinar
GradNation Acceleration Grant Informational Webinar
 
Focus on Student Engagement: Individual Learning Plans
Focus on Student Engagement: Individual Learning PlansFocus on Student Engagement: Individual Learning Plans
Focus on Student Engagement: Individual Learning Plans
 
Impacting College-Going and Completion Rates in Your Community: Taking a Syst...
Impacting College-Going and Completion Rates in Your Community: Taking a Syst...Impacting College-Going and Completion Rates in Your Community: Taking a Syst...
Impacting College-Going and Completion Rates in Your Community: Taking a Syst...
 
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
 
Assessment
AssessmentAssessment
Assessment
 
Credit Flexibility Presentation by Sarah Luchs
Credit Flexibility Presentation by Sarah LuchsCredit Flexibility Presentation by Sarah Luchs
Credit Flexibility Presentation by Sarah Luchs
 
A guide for comprehensive needs assessment
A guide for comprehensive needs assessmentA guide for comprehensive needs assessment
A guide for comprehensive needs assessment
 
A guide for comprehensive needs assessment
A guide for comprehensive needs assessmentA guide for comprehensive needs assessment
A guide for comprehensive needs assessment
 
Cdpi keynote
Cdpi keynoteCdpi keynote
Cdpi keynote
 
Virtual Learning Policy Consideration (iNacol)
Virtual Learning Policy Consideration (iNacol)Virtual Learning Policy Consideration (iNacol)
Virtual Learning Policy Consideration (iNacol)
 
A Quiet Crisis
A Quiet CrisisA Quiet Crisis
A Quiet Crisis
 

ICF_AEA_multipaper

  • 1. icfi.com | Evaluation 2013: The State of Evaluation Practice in the Early 21st Century 27th Annual Conference of the American Evaluation Association Washington, DC, USA Design Considerations in Evaluating the Implementation of College Access, College Readiness, and Career Pathway Initiatives Thomas Horwood, Chair Barbara O’Donnel, Discussant October 18, 2013 | 4:30 - 6:00 PM
  • 2. icfi.com | Determining Dosage: Evaluating the Implementation of a State GEAR UP Initiative Ashley Briggs Charles Dervarics October 18, 2013 Presented to: American Evaluation Association 2013 Conference
  • 3. icfi.com | 3 GEAR UP Nationally  First grants funded in 1999 – Originally based on I Have a Dream program concept  Cohort approach – Most grantees follow a cohort of students from 7th grade through to postsecondary education, with support services (tutoring, mentoring, college visits) available  FY12 – 132 federal grants serving 647,772 students  Multiple award types – State grants and local partnership grants OVERVIEW
  • 4. icfi.com | 4 About the Texas GEAR UP State Grant (SG) OVERVIEW  FY12 state grant from USED – approx. $5M per year (2012 to 2019)  Focused on a single cohort of students starting in Grade 7 (students are in Grade 8 in 2013–14)  Includes district and statewide services  District services – Support schools in four districts (7 MS  5 HS) to increase academic rigor – Increase number of Grade 8 students succeeding in Algebra I (short-term goal) – Provide teacher professional development to support delivery of rigorous courses (such as Pre-AP training) – Provide teacher professional development to support postsecondary goals (financial literacy) – Promote vertical alignment of core subject teachers across the grades – Support college visits, summer learning opportunities, and tutoring services
  • 5. icfi.com | 5 About the Texas GEAR UP State Grant (SG) (cont’d) OVERVIEW  Statewide Services – Postsecondary information dissemination to students and families statewide – Active, in-depth web site with information for students and families – Online communication and teaching platform available statewide – Statewide coalition of GEAR UP grantees (including local partnership grants not directly under the SG)  TEA GEAR UP Partners – The University of Texas at Austin’s Institute for Public School Initiatives [IPSI] – TG [Texas Guaranteed Student Loan Corporation] – College Board – AMS Pictures
  • 6. icfi.com | 6 About the Texas GEAR UP SG Evaluation EVALUATION DESIGN AND METHODOLOGY The external evaluation is a longitudinal 7-year study using a quasi- experimental design that started in January 2013 to:  Provide ongoing formative evaluation of facilitators/barriers, promising practices, and recommended next steps  Explore implementation status, trends in the mix of implementation, and relationships between implementation and outcomes  Determine impact including short, intermediate, and long-term student outcomes  Identify impact on relevant family, school, and community partnership outcomes  Examine access to and use of statewide opportunities  Understand cost, spending, and sustainability
  • 7. icfi.com | 7 Data Sources EVALUATION DESIGN AND METHODOLOGY  Extant Data – Documents: Texas GEAR UP SG Grant Application, Notice and Grant Awards (NOGAs), and implementation plans – Student level data: Demographics, attendance, high school course completion and high school completion, school personnel, and district organizational information – School level data: Profile information about campus-level performance, staff, finances, and programs  Student Tracking System (Annual Performance Report – APR) – Format: Submission by 4 subgrantee districts using a prepopulated spreadsheet – Topics: Advanced course-taking; Academic services; Student services; Student events and attendance; Parent events and attendance; Teacher professional development and enrollment; Community partners
  • 8. icfi.com | 8 Data Sources (cont.) EVALUATION DESIGN AND METHODOLOGY  Surveys with Parents and Students – Format: Online and paper-based versions in English and Spanish – Topics: Aspirations and expectations; Knowledge of financial aspects; Knowledge of college requirements; Perceptions of Texas GEAR UP SG  Site Visits to Texas GEAR UP SG Schools – Format: 1-1.5-day visits including interviews and focus groups with school staff, teachers, students, parents, and community partners – Topics: GEAR UP activities and events (school and statewide); Knowledge of college requirements and financial aspects; Perceptions of Texas GEAR UP SG; Readiness for success in college  Interviews with Key Leaders from TEA and Partner Organizations – Format: Telephone interviews – Topics: Level of partner involvement; Perceptions of program; Progress on statewide implementation
  • 9. icfi.com | 9 Initial Analysis: Implementation ANALYSIS  Data Source: Student tracking system (APR) and site visits  Primary Analysis: Descriptive statistics on participation, dosage (number of hours, events), and mix (range of services/activities); disaggregation by school, subject area, and format (virtual or in-person) Implementation Strategy A B C D E F G Adv. Course X X X X X X X SSS: Tutoring X X X X X (math) X (math) X SSS: Mentoring X X SSS: Counseling/Advising X SSS: Other Activities X (math) X (math) College Visit X X X X Job Site Visit X Student Events X X X X X X Parent Events X X X X X Teacher PD X X X X Community Partners X X X X Use Statewide Services X X X Total 4 6 5 5 8 7 11
  • 10. icfi.com | 10 Initial Analysis: Plans, Knowledge, and Perceptions ANALYSIS  Data Source: Student and parent surveys  Primary Analysis – Descriptive statistics (frequencies, averages, ranges) – Crosstabs (chi-square analyses comparing frequency distribution by subgroup) – Analysis of variance (ANOVA) comparing means by subgroup – Correlation  Key Baseline Takeaway: Both parent and student aspirations often exceeded expectations, suggesting they are concerned about being able to achieve their educational dreams.  Key Baseline Takeaway: Few students or parents perceive themselves as very knowledgeable, which can potentially be changed by participation in Texas GEAR UP SG.  Key Baseline Takeaway: Student overall satisfaction with Texas GEAR UP SG was highest at one school, where 41% of students indicated they were very satisfied.
  • 11. icfi.com | 11 Initial Analysis: Costs and Lessons Learned ANALYSIS  Cost – Data Source: Budgets and reported draw downs – Primary Analysis: Descriptive statistics, breakdown by cost categories  Facilitators and Barriers – Data Source: Survey and site visit data – Primary Analysis: Descriptive statistics, analysis of open-ended survey responses, qualitative analysis – Key Baseline Takeaway: Parents reported that engagement in activities is facilitated when topics are of interest to them, when events are held at times appropriate for their schedule, and when their student is also engaged.  Potentially Promising Practices – Data Source: Site visit data – Primary Analysis: Qualitative analysis – Key Baseline Takeaway: Early successes at some schools related to afterschool mathematics programs, enhanced college visits, and family events.
  • 12. icfi.com | 12 Forthcoming Analysis beyond Year 1 ANALYSIS  Level and Mix of Implementation: Analysis of various service factors – Provision type (virtual or on-line) – Frequency of delivery (number of hours, number of sessions) – Mix of services (e.g., enrollment in and tutoring in an advanced course) – Quality of implemented activities  Plans, Knowledge, and Perceptions: Disaggregation by student characteristics – Gender, race/ethnicity, LEP status, special education status – Participation in advanced coursework  Cost – Descriptive analysis of actual expenditures (annual and cumulative) by cost category  Types of Analysis – HLM (with student, school, and district levels) and cluster analysis – Impact analysis using extant outcome data – Comparisons using PSM – Linkages between outcomes and implementation – Change in implementation over time – Relationship of actual implementation and proposed plans
  • 13. icfi.com | 13 Lessons About This Evaluation from Year 1 LESSONS  Caution interpretation based on the period of data collection.  Use crosswalk to address 60+ evaluation questions.  Ensure common definitions of program services.  Consider ways to verify information across data sources.  Maximize the use of online surveys.  Leverage various strategies to obtain sufficient parent response rates.  Analyze data at multiple levels (school and district).  Utilize district-level case studies to understand the context in which implementation occurs.
  • 14. 14 O’Donnel, B., Briggs, A., Dervarics, D., Horwood, T., Sun, J., Alexander, A., Zumdahl, J., & Rhodes, J. (2013, September). Annual Implementation Report #1: Texas GEAR UP State Grant Evaluation. Report prepared for the Texas Education Agency by ICF International. Available online at: http://www.tea.state.tx.us/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=2576980765 9&libID=25769807662#25769807659 For more information, the report is publicly available:
  • 15. icfi.com | 15 Assessing the Fidelity of Implementation in the Diplomas Now Evaluation October 18, 2013 Presented to: American Evaluation Association 2013 Conference Felix Fernandez Aracelis Gray
  • 16. icfi.com | 16 Overview of Diplomas Now Study Diplomas Now is school turnaround model that unites three organizations – Talent Development, City Year, and Communities In Schools A random assignment design Sixty-two schools in 11 districts across the country participating in the study. Study will compare student outcomes in the 32 middle and high schools that implement DN to those in the 30 schools that do not.
  • 17. icfi.com | 17 Overview of Diplomas Now Implementation Study  Overall goal: document implementation in the 32 DN schools.  Research Questions: – How much variation in implementation fidelity was there across sites? – What were the largest challenges to implementing the DN model? – What were the most important ways in which the intervention as implemented differed from the intervention as planned?
  • 18. icfi.com | 18 DN Fidelity of Implementation  Fidelity of implementation is based on the DN Logic Model and measured by the Fidelity of Implementation Matrix.  The matrix is made up of 111 separate components, 62 of which were identified as critical to adequate implementation.  The fidelity matrix consist of 9 inputs ranging from program staff training, family and community involvement and student supports.
  • 19. icfi.com | 19 DN Fidelity of Implementation  That is… Input 1 Component X Component Y Component Z
  • 20. icfi.com | 20 DN Fidelity of Implementation  And… Overall Fidelity Input 1 Input 2 Input 3 Input 4 Input 5 Input 6 Input 7 Input 8 Input 9
  • 22. icfi.com | 22 Overview of Fidelity Matrix – Program Staff Training and Professional Development • 18 individual components, 15 of which are critical – Integrated On-Site Support (Critical Input) • 11 individual components, 9 of which are critical – Family and Community Involvement • 6 individual components, 1 of which is critical – Tiered Intervention Model (Critical Input) • 3 individual components, 2 of which are critical – Strong Learning Environments (Critical Input) • 6 individual components, 4 of which are critical – Includes 1 MS and 1 HS specific critical component
  • 23. icfi.com | 23 Overview of Fidelity Matrix – Professional Development and Peer Coaching (Critical Input) • 5 individual components, 2 of which are critical – Includes 1 HS specific component – Curriculum for College Readiness • 24 individual components, 4 of which are critical – Includes 7 MS and 17 HS specific components – Student Supports (Critical Input) • 24 individual items, 19 of which are critical – Student Case Management (Critical Input) • 14 individual items, 5 of which are critical
  • 24. icfi.com | 24 DN Fidelity of Implementation  Divided into two metrics, a categorical rating and a continuous score: 1. Implementation Rating (categorical measure): focus on critical components 2. Implementation Score (continuous measure): allow for assessment of greater variability between sites  Together they provide the flexibility to: – Look at categories that emerge – See if scores vary by rating, amount of overlap, etc. – Study relationships between implementation and outcomes
  • 25. icfi.com | 25 Data Sources  Fidelity of Implementation Data stem from the following sources: – Diplomas Now Implementation Support Team (DNIST) Survey – School Transformation Facilitator (STF) Survey – Citi Year Program Manager (CYPM) Survey – Communities In Schools (CIS) Site Coordinator (SC) Survey – Communities In Schools (CIS) Site Records
  • 26. icfi.com | 26 Fidelity of Implementation: Strong Learning Environments Component Operational definition Fidelity Scale Criterion Critical Sample Response Strong Learning Environments Small Learning Communities Teams of Teachers working with the same small group of students 0: No 1: Yes 1: Adequate/ High Fidelity YES 1:Yes Interdisciplinary Teams Frequency of Interdisciplinary team meeting 0: do not /rarely occur 1: occur monthly 2: occur bi-weekly 3: occur weekly 4:occur multiple times a week 5: occur daily 4: Adequate 5: High Fidelity YES 4:occur multiple times a week DN Site-Based Meeting Admin, STF, Program Manager, Site Coordinator hold brief review of program implementation (approx. 30 minutes) 0: Once a month or less 1: Biweekly 2: Weekly or more frequently 1=Adequate, 2= High Fidelity YES 0: Once a month or less DN Site-Based Collaboration Site based collaborative (Admin, STF, PM, Site Coordinator) have norms for collaboration, standards for communication, and frameworks for decision making 0: No 1: Partially/In Process 2: Yes 1=Adequate, 2= High Fidelity No 1: Partially/In Process 4x4 Block (High School Only Question) Four 75-90 minute class periods that meet daily 0: No 1: Hybrid/Acceptable Alternative 2: Yes 1= Adequate 2= Highly Fidelity YES 0: No
  • 27. icfi.com | 27 Implementation Rating The implementation rating focused on the “Critical to Fidelity/Adequate Rating” column within the fidelity matrix. Using this column each input (e.g., program staff professional development) of the DN model was provided with one of two ratings: 1. “Successful” - have met all components identified as critical 2. “Developing” - did not meet one or more critical components In addition to critical components, critical inputs have also been identified (i.e., inputs critical to an adequate implementation).
  • 28. icfi.com | 28 Implementation Rating  Individual input ratings served as the basis for the site-level fidelity rating, which has been broken up into four parts: 1. Low: successful on less than 3 critical inputs 2. Moderate: successful on at least 3 critical inputs 3. Solid: successful on at least 5 critical inputs 4. High: successful on 8 or more inputs including 5 critical inputs
  • 29. icfi.com | 29 Example: Implementation Rating  Implementation Rating only takes into account components identified as critical. In this case: – Teams of teachers working with the same small group of students – Frequency of interdisciplinary team meetings – DN site-based meeting – 4x4 Block  Given our sample responses this site has met the criterion for adequate implementation for teams of teachers and frequency of interdisciplinary team meetings but not DN site-based meetings or 4x4 classroom blocks.  It would therefore receive an implementation rating of “Developing” on this input.
  • 30. icfi.com | 30 DN Fidelity Implementation Rating Flowchart: each of the nine inputs (Program Staff Professional Development, Integrated On-Site Support*, Tiered Intervention Model*, Professional Development and Peer Coaching*, Student Supports*, Student Case Management*, Family and Community Involvement, Strong Learning Environments*, Curriculum for College Readiness) is rated Successful or Developing, and the counts roll up to the site-level rating: Low Implementation (successful on fewer than 3 critical inputs), Moderate Implementation (successful on at least 3 critical inputs), Solid Implementation (successful on 5 critical inputs), High Implementation (successful on 8+ inputs). * indicates critical inputs
  • 31. icfi.com | 31 Implementation Score The implementation score focused on the “Fidelity Scale” column within the fidelity matrix. Using this column, each site received a score for each input, calculated as the equally weighted sum of the site’s fidelity scale responses divided by the number of components in that input. The average of the nine input scores then formed the site-level implementation score.
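Stated compactly (the symbols below are ours, introduced for illustration; the slides give the rule only in prose):

```latex
% r_{ij}: site's response on component j of input i
% m_{ij}: maximum value of that component's fidelity scale
% n_i:    number of components in input i
\mathrm{InputScore}_i = \frac{1}{n_i} \sum_{j=1}^{n_i} \frac{r_{ij}}{m_{ij}},
\qquad
\mathrm{SiteScore} = \frac{1}{9} \sum_{i=1}^{9} \mathrm{InputScore}_i
```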
  • 32. icfi.com | 32 Example: Implementation Score  Implementation scores are calculated by taking the sum of the weighted responses divided by the total number of components. – Scale scores are equally weighted; for example, a component scaled 0-2 would be recoded 0 = 0, 1 = .5, and 2 = 1  Adding up the weighted fidelity scale responses yields 2.3 (1 + .8 + 0 + .5 + 0).  There are 5 Strong Learning Environments components.  The site’s implementation score for this input would therefore equal 2.3 divided by 5, or .46.
  • 33. icfi.com | 33 DN Fidelity Implementation Score Flowchart: for each input, X is the equally weighted sum of its fidelity scale components, and the input score is X divided by the number of components in that input. For the nine inputs in order (Program Staff Professional Development, Integrated On-Site Support, Tiered Intervention Model, Professional Development and Peer Coaching, Student Supports, Student Case Management, Family and Community Involvement, Strong Learning Environments, Curriculum for College Readiness), the input scores are X / 18, X / 11, X / 6, X / 3, X / 5, X / 5, X / 17, X / 24, X / 14; each input score is divided by 9 and the results are summed to give the site-level score. Note: sample calculations provided are only for HS data.
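A matching Python sketch of the score calculation, reusing the slide 26 sample responses (the scale maxima and data structure are ours; the equal weighting and averaging rules come from the slides):

```python
# Minimal sketch of the DN implementation score calculation.

def input_score(components):
    """Equally weight each component as response / scale maximum, then average."""
    weighted = [c["response"] / c["scale_max"] for c in components]
    return sum(weighted) / len(components)

def site_score(all_inputs):
    """Site-level implementation score: the average of the nine input scores."""
    return sum(input_score(c) for c in all_inputs) / len(all_inputs)

# Strong Learning Environments sample responses (slide 26):
sle = [
    {"scale_max": 1, "response": 1},  # Small Learning Communities: 1 (Yes) -> 1.0
    {"scale_max": 5, "response": 4},  # Interdisciplinary Teams: 4 of 5     -> 0.8
    {"scale_max": 2, "response": 0},  # DN Site-Based Meeting: 0            -> 0.0
    {"scale_max": 2, "response": 1},  # DN Site-Based Collaboration: 1 of 2 -> 0.5
    {"scale_max": 2, "response": 0},  # 4x4 Block: 0                        -> 0.0
]
print(round(input_score(sle), 2))  # -> 0.46 (2.3 / 5), matching the worked example
```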
  • 34. icfi.com | 34 Fidelity of Implementation  Independently, each measure provides useful but different information.  Together, they provide flexibility in understanding implementation, allow for detailed discussion of site fidelity, and help to shape an implementation story.
  • 35. icfi.com | 35 Evaluating a Career Pathways Research Initiative PathTech: Successful Academic and Employment Pathways in Advanced Technology Kristen Peterson October 18, 2013 Presented to: American Evaluation Association 2013 Conference
  • 36. icfi.com | 36 The PathTech Program
  • 37. icfi.com | 37 Background on the PathTech Project  Funded through a grant from the National Science Foundation (NSF) under the Advanced Technological Education (ATE) Program  ATE promotes the improvement of education for science and engineering technicians entering high-technology fields  The ATE program supports many different types of activities including: – Articulation between two-year and four-year programs – Career pathways – Curriculum development – Educator professional development – General research advancing the understanding of educating technicians for careers in high-technology fields
  • 38. icfi.com | 38 Background on the PathTech Project  Successful Academic and Employment Pathways in Advanced Technologies (PathTech) – A research study examining the progression of students from high school into advanced technology programs and into the workforce – A four-year study currently entering the third year of the project  Collaborative grant awarded to the University of South Florida (USF) – Grant partnership includes USF researchers, the Florida Advanced Technological Education Center (FLATE), and four south Florida community colleges • Hillsborough Community College • Polk State College • St. Petersburg College • State College of Florida
  • 39. icfi.com | 39 PathTech Research Design and Evaluation
  • 40. icfi.com | 40 PathTech Research Questions 1. Who enrolls in engineering technology (ET) programs out of high school? – How are student demographic and academic characteristics related to ET enrollment? – How do students learn about ET programs? – How can the pathway from high school into ET programs be improved? 2. How do ET students benefit from enrolling (in degree programs) and earning degrees through these programs? – What are the most critical steps in ET degree attainment from enrollment through gatekeeper courses and to the degree? – How do these students become ET program graduates? – How do the ET students differ from comparable students in their degree and employment outcomes?
  • 41. icfi.com | 41 Design Considerations for PathTech  Mixed-methods study  Quantitative Data and Analysis: – Descriptive statistics and empirical analysis with quantitative data from state databases  Qualitative Data and Analysis: – Ethnographic and qualitative analyses of engineering technology programs – Three data sources: • Interviews with community college students • Interviews with students at feeder high schools • Interviews with local industry partners
  • 42. icfi.com | 42 Evaluation Approach  External evaluation of PathTech complements and supports the efforts of the PathTech research team and involves: 1. Monitoring the progress of various aspects of the project 2. Providing objective reviews of project instruments, plans, reports and other materials 3. Serving as an external resource for technical advice  Designed a flexible evaluation model to account for a dynamic work plan – Developed an annual workplan tracker – Update the workplan tracker monthly – Facilitate monthly status calls with the research team
  • 43. icfi.com | 43 PathTech Evaluation Challenges
  • 44. icfi.com | 44 Evaluation Challenges: Quantitative Data Analysis  Using data from the Florida Department of Education’s Florida Education and Training Placement Information Program (FETPIP) – Initial requests for data went unanswered – New policy not to release FETPIP employment data in conjunction with supplemental educational data that includes demographic data  Pursuing data from other sources, including the National Academy Foundation (NAF) – NAF is a network of over 500 career-themed academies across the country, which include engineering as a career theme. – The data would provide longitudinal, national, state, and regional student-level data, including data on academic performance, demographic characteristics, and academy assessments.
  • 45. icfi.com | 45 Evaluation Challenges: Qualitative Data Collection  Multiple sources of qualitative data, including high schools with a focus on engineering technology, community colleges, and engineering technology industry partners – Initial pilot study at one community college, one high school, and one industry partner – Data collection and analyses are underway for community colleges and for industry employers and recruiters – Challenge recruiting high schools with relevant engineering technology programs  Participant and stakeholder buy-in across sites – Particular challenge among area high schools
  • 46. icfi.com | 46 Evaluation Challenges: External Evaluation Considerations  Challenge: Monitoring progress with a flexible project design and dynamic work plan – Tracking changing timelines – Reporting on initial and updated benchmarks  Response: Task management and evaluation tools – Monthly progress meetings – Flexible workplan task tracker, designed to accommodate high-level and individual task changes
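One way to picture such a tracker is the hypothetical sketch below; the field names, statuses, and example task are our assumptions, since the slides do not describe the tool at this level of detail.

```python
# Hypothetical sketch of a flexible workplan tracker entry; fields and
# statuses are illustrative assumptions, not the evaluation team's actual tool.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class WorkplanTask:
    name: str
    initial_benchmark: str                   # target from the original plan
    updated_benchmark: Optional[str] = None  # revised target if the plan shifts
    status: str = "not started"              # e.g., "not started", "in progress", "complete"
    notes: List[str] = field(default_factory=list)

    def revise(self, new_benchmark: str, reason: str) -> None:
        """Record a timeline change while preserving the initial benchmark,
        so reporting can cover both initial and updated benchmarks."""
        self.updated_benchmark = new_benchmark
        self.notes.append(reason)

# Example monthly update (task and dates are invented):
task = WorkplanTask("Recruit feeder high schools", initial_benchmark="Year 2, Q1")
task.revise("Year 2, Q3", "Difficulty recruiting schools with relevant ET programs")
task.status = "in progress"
```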
  • 48. icfi.com | 48 Current Project Status and Next Steps  Qualitative pilot studies have been completed with high school students, community college students, and industry partners – Pilot interviews were transcribed and thematically coded – Initial findings indicate three main factors influencing engineering technology pathways and several pathways leading into engineering technology fields  Interview protocols have been finalized and community college students have been recruited into the study  20 in-depth interviews with industry members were completed, and transcription and analysis are underway  Ongoing challenges obtaining quantitative data; pursuing alternate data sources  Multiple publications in progress  Continue to monitor the research team’s progress and help overcome barriers
  • 49. icfi.com | 49 Contact Information Program Evaluation  Barbara O’Donnel, Principal, Barbara.ODonnel@icfi.com  T.J. Horwood, Senior Manager, Thomas.Horwood@icfi.com Texas GEAR UP State Grant  Ashley Briggs, Senior Associate, Ashley.Briggs@icfi.com  Chuck Dervarics, Expert Consultant, Chuck.Dervarics@icfi.com Diplomas Now  Felix Fernandez, Technical Specialist, Felix.Fernandez@icfi.com  Aracelis Gray, Senior Manager, Aracelis.Gray@icfi.com PathTech  Kristen Peterson, Senior Associate, Kristen.Peterson@icfi.com

Editor's Notes

  1. I was going to insert this into Ashley’s presentation too.
  2. I think we should take out T-STEM based on what was said on the last call.
  3. The first implementation report was primarily formative, based on Year 1 implementation. Forthcoming analysis will look at outcomes, impact, and expenditures as both district and statewide programs are rolled out. The evaluation design was developed by conceptualizing how change is likely to occur as a result of the Texas GEAR UP SG through the development of a logic model. The logic model maps out the inputs (student, teacher/school, and parent/community), program implementation activities, and intended outcomes of the program to be delivered. Family: knowledge and aspirations; school: PD participation; community: establishment of partnerships
  4. Mixed-methods design to allow for checks and balances across multiple methods APR: step-by-step guidance on how to enter student-level data into an Excel sheet For this report, a limited number of student-level variables were included in the prepopulated fields on the APR (i.e., gender, race/ethnicity, LEP status, special education status).
  5. SV topics varied by group; role-specific protocols include different topic foci To best understand the role of various partners and progress at the state level, we conducted interviews with the Texas GEAR UP SG state project director at TEA and with appropriate personnel from each of the TEA Partners in May/June 2013, along with partner focus groups/interviews
  6. Analysis included a look at student-level participation as well as number of hours in activities such as tutoring or number of events/workshops held. Other ways of slicing participation data: for example, enrollment in advanced courses included any course, # of courses, and specific subject areas; mix of SSS included # of students receiving no, 1, 2, or 3+ SSS; this shows the broad range of activities that students are involved in. SV data helped to provide detailed understanding about how schools decided which students would receive SSS and who provided those services with what funds; SV data were also helpful in verifying APR data (virtual and mentoring). Mix: it is unclear if the differences in level of implementation across schools are related to school perceptions of which SSS may be helpful to students, or if they differ due to a need for schools to develop better strategies to identify students requiring SSS or to increase their capacity to provide the services to students. Initial analysis included some preliminary connections across sources; the school with high student satisfaction also implemented 11 elements, to continue exploring once outcome data are available. While School G engaged in this broad range of activities, it was also relatively lower than several schools on both advanced course enrollment and student tutoring
  7. Subgroups included comparisons between schools and between parents and students Example of correlation: occurrence of discussions with GU staff and self-reported knowledge Knowledge: requirements and financial
  8. Draw downs may not reflect actual expenditures. That is, some school districts do not draw down grant funds on a regular basis and instead wait until the end of the grant period to draw down all funds spent. One area is the Year 1 budget and expenditures for the overall Texas GEAR UP SG as managed by TEA. A second area is the topline budget and expenditures of the four Texas GEAR UP SG school districts. Cost categories- payroll, professional services, supplies, other OCs, capital outlay Matching funds Analysis of open-ended items (e.g., needed supports) and creation of new codes- survey revisions Individual case studies
  9. Next AIR Aug 2014 Subgroup analysis will include examining trends such as whether students in advanced courses are more or less likely than those who are not to be tutored in that subject. Parent participation will be examined relative to student characteristics (e.g., whether students with special needs or in advanced courses are more or less likely to have parents participating in GEAR UP events). ANCOVA Cluster analysis will be conducted to identify groups of students participating in a given mix of activities/services. As outcomes become available, it will be of interest to understand whether specific implementation activities are associated with outcomes and/or if it is some level (amount) or mix of implementation that is related to outcomes. Year 1 implementation data were explored to begin to understand potential strategies for developing mix-of-implementation variables based on early patterns of mix of implementation at the school level. Use linkages to outcomes to identify promising practices, beyond just what is reported as great Doc analysis using ATLAS.ti to track district-level implementation related to proposed plans While the 2012–13 Grade 7 cohort is the primary target for Texas GEAR UP SG implementation, it is hoped that future cohorts of students will also benefit through sustained implementation of the program with new Grade 7 students. Therefore, the evaluation team will compare outcome data from the follow-on cohorts as well.
  10. Implementation began in Nov/Dec 2012 and data were collected March-May the next year; other factors that delayed Year 1 implementation included PD and student schedules, and staff changes at TEA and districts that we were unable to change; need to consider in analysis Crosswalk included specific questions, the data sources and items to address them, and the types of analysis to reach that goal; helped to ensure no stone was left unturned Common definitions: advanced courses, mentoring Online surveys: test for technical glitches (school firewalls) Parent response rate as low as 6%
  11. We measure “components,” which we aggregate into “inputs”
  12. The nine “inputs,” when considered together, help us understand overall fidelity.
  13. A similar process would be conducted on the remaining eight inputs to provide a final count of the number of inputs successfully implemented, allowing then for calculation of a rating
  14. Funded through a grant from NSF Directorate for Education and Human Resources under the Advanced Technological Education (ATE) Program ATE promotes the improvement of education, particularly at 2-year community colleges for science and engineering technicians entering high-technology fields
  15. A research study designed to examine the progression of students from high school into advanced technology programs, specifically engineering technology programs at community colleges, and into the workforce Specifically, the PathTech research study contributes to the overall goals of the ATE program by seeking to: Understand recruitment and pathways into engineering technology programs Improve education in engineering technology programs Recommend interventions at high schools to increase the visibility of engineering technology programs at local community colleges Produce more qualified science and engineering technicians to meet workforce demands
  16. Planned to use data from the Florida Department of Education’s Florida Education and Training Placement Information Program (FETPIP) Initial requests for data went unanswered New policy not to release FETPIP employment data in conjunction with supplemental educational data that includes demographic data Presents a challenge to answering research questions about how demographics are related to entry into ET programs and the ET industry, as well as basic questions about the types of individuals in these careers and programs Additional efforts include partnering with the Community College Research Center at Columbia University to combine efforts to analyze state longitudinal data from community college technician education programs outside of Florida Challenge in using a new data set with different variables and analytic possibilities: Can these data be used to analyze the original research questions? Can analysis take place quickly enough to make up for lost time in negotiating with FLDOE for FETPIP data? Currently, many unknowns, resulting in a shift to concentrate efforts on qualitative research during the first half of the project and, hopefully, on quantitative analysis in the remaining two years. However, qualitative analysis has not been free from challenges either.
  17. Changing timelines (i.e., initial plan emphasized quant in years 1-2, qual in years 3-4; now reversed) Monitoring validity and feasibility of proposed alternatives (with proposed alternative data sets) Reporting on initial and updated benchmarks (noting how updated data and an updated timeline still answer the initial research questions and meet proposed requirements)