Be A Champion

Published on: Ace Conference 2013, Austin, Texas
Published in: Education, Technology
  1. 1. Be A Champion, Inc. Texas ACE Program Evaluation Report 21st Century Community Learning Centers, Cycle 6 Year 4 NOGA ID Number 126950147110007
  2. 2. Waits Consulting i Table of Contents
Executive Summary .... iii
I. Evaluator Information .... 1
a. Scope of Work for Evaluator .... 2
b. Cost for Evaluator .... 3
II. Purpose of Program .... 3
a. Description of the program needs and how the program addressed those needs .... 4
Table 1. Description of Need .... 4
b. Philosophies employed and how this is disseminated to stakeholders .... 5
III. Program Objectives .... 6
a. Expectations .... 6
Table 2: Out-of-School Time Program Major Objectives, Measurement of Objectives, and Data Source .... 7
b. Status of objectives .... 7
Table 3: Status of Evaluation Outcomes .... 8
IV. Program Description .... 8
a. Proposed budget and actual expenditures .... 8
Table 4: Budget and expenditures .... 8
b. Structure of Program .... 9
Table 5: Program Operation and Structure .... 9
c. Staff including specific professional development opportunities, staff participation & expectations of performance as a result of the professional development .... 9
Table 6: Out-of-School-Time Center Roles and Staffing .... 9
Table 7: Program Staff Development .... 10
d. Activities offered including levels of participation .... 11
e. How activities relate to program objectives .... 12
Table 8: Matrix of Program Activities and Components Addressed .... 13
V. Outcomes .... 13
a. Key Outcomes .... 13
Chart 1: Changes in reading grades from fall to spring .... 14
Chart 2: Reading grade changes for all State of Texas 21st Century Participants .... 14
Chart 3. Changes in math grades from fall to spring .... 15
Table 9: Participant and campus attendance rates .... 16
a. Additional program data collected for elements specified in the RFP .... 17
Table 10: BAC, Cycle 6, Year 4 performance measures required by Federal and state granting authorities .... 17
b. Additional program data collected .... 20
VI. Analysis .... 30
a. "How well did you do?" .... 32
Table 13: Summary of outcomes evaluation findings .... 32
b. Discussion of program outcomes successes .... 33
c. Reasons for program successes .... 34
d. Results falling short of expectations .... 34
  3. 3. Waits Consulting ii e. Reasons for falling short of expectations .... 34
VII. Next Steps .... 35
Appendix 1: Enrollment, Days Scheduled, and Average Daily Attendance by Center and Activity .... 37
Appendix 2: Changes in Reading Grades and Math Grades during the cycle on the part of participants .... 39
  4. 4. Waits Consulting iii Executive Summary The purpose of this executive summary is to highlight the objectives, activities, results, and next steps reviewed in the following pages of this program evaluation report. Be A Champion, Inc., identified five program objectives along with performance measures that would be focused on in Year 4 to determine program success. The objectives were to:
Improve Academics
Improve Attendance
Improve Behavior
Improve Promotion Rates
Improve Graduation Rates
During site visits and review of data we noted areas in program operations where improvements could be made. These items are discussed in the key findings and recommendations section below. Key Findings and Recommendations Finding 1: In order to make annual evaluations more useful for enhancing the out-of-school-time program, the outcomes evaluation portion of this project needs to be improved. Improvements to the Texas 21st Century databases are especially needed. Recommendation 1: The outcomes evaluation design of this project would benefit considerably from incorporating more pretest measures, a comparison or control group, and more antecedent variables into the data collection and analysis. Incorporating more pretest measures would help to eliminate possible selection biases that potentially over- or underestimate the true impact of the ACE program. A comparison or control group would help to eliminate not only such selection biases but also other internal validity problems possibly arising from such likely error sources as history, testing, and maturation. Furthermore, including more antecedent variables in the data collection and analysis would help to further eliminate the possibility that reported program outcomes are spurious owing to prior variables simultaneously influencing both out-of-school-time participation and project results.
To that end, the project director should meet early on with the Waits Consulting Group evaluation team to work out the specifics of this recommendation. Finding 2: Parent involvement in the program was rather low, especially relative to parents' program enrollment across the centers. Recommendation 2: The program needs to elicit more active participation on the part of parents. Improved communication with parents, stressing the importance of attending
  5. 5. Waits Consulting iv program activities seems vital. Additionally, conducting a “best time to attend” survey of parents likely would help to identify and circumvent scheduling difficulties. Finding 3: It is more difficult to measure improvement in students' behavior than in the other ACE objectives. Recommendation 3: More difficult constructs, such as students' behavior and changes over time in students' behavior, require more indicators for reliable and valid measurement. Accordingly, indicators that go beyond required state and Federal performance measures (i.e., discipline referrals) need to be adopted. In particular, in future funding cycles, such additional measures as student conduct grades, teacher survey measures, and parental survey questions should also be considered and required of after-school programs. Such additional measures will facilitate the more careful statistical analysis and investigation of program outcomes regarding students' behavior. This program (BAC) utilized parent surveys about behavior and is encouraged to continue them. Additional questions about changes in behavior should be included in future parent surveys. Finding 4: The program's objective of improved promotion rates was not supported by the evidence. Recommendation 4: A lack of evidence was a part of the difficulty confronting the Waits Consulting Group evaluation team in assessing the program's success regarding students' promotion rates. In the future, promotion rates need to be investigated with more data. Particularly needed are more and better data available in the Texas 21st databases, data that will allow for a more rigorous analysis with more sophisticated statistical models.
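To illustrate why Recommendation 1 calls for pretest measures and a comparison group, the sketch below shows a simple difference-in-differences calculation on invented fall and spring grades. All numbers are hypothetical and are not drawn from the BAC data; the report's actual analyses were conducted in SPSS.

```python
# Illustrative sketch only: hypothetical grade data showing how pretest
# measures plus a comparison group support a difference-in-differences
# estimate, as Recommendation 1 suggests. All numbers are invented.

def mean(xs):
    return sum(xs) / len(xs)

# Fall (pretest) and spring (posttest) reading grades, hypothetical.
participants_fall = [72, 75, 68, 80, 71]
participants_spring = [78, 79, 74, 84, 75]
comparison_fall = [73, 74, 70, 79, 72]
comparison_spring = [75, 75, 72, 80, 73]

# Gain for each group: posttest mean minus pretest mean.
gain_participants = mean(participants_spring) - mean(participants_fall)
gain_comparison = mean(comparison_spring) - mean(comparison_fall)

# The difference-in-differences estimate nets out change that would have
# occurred without the program (history, maturation, testing effects).
did_estimate = gain_participants - gain_comparison

print(f"Participant gain: {gain_participants:.1f}")
print(f"Comparison gain:  {gain_comparison:.1f}")
print(f"DiD estimate:     {did_estimate:.1f}")
```

The comparison group's gain stands in for the change participants would have shown anyway, which is precisely the internal-validity protection the recommendation describes.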
  6. 6. Waits Consulting 1 I. Evaluator Information The Waits Consulting Group is an independent, interdisciplinary team composed of former and current public school administrators, higher education administrators, and business professionals with over 100 years of combined experience in the public and private sectors. Members of the team have advanced degrees, including Ph.D., Ed.D., M.B.A., and M.Ed. degrees. In the pursuit of these degrees and in other advanced studies, various members have completed coursework in program evaluation, statistics, statistical sampling, qualitative research methods, evaluation design, and other program assessment techniques. Further, the Waits Consulting Group has had considerable prior experience in evaluating Afterschool Centers on Education (ACE) administered by the Texas Education Agency for the Federally-funded 21st Century Community Learning Center (CCLC) grants program. In particular, the Group has successfully completed evaluations of the following after-school centers during the noted funding cycles and program years:
HISD Cycle 5, year 3
HISD Cycle 5, year 4
HISD Cycle 5, year 5
HISD Cycle 6, year 2
HISD Cycle 6, year 3
HISD Cycle 6, year 4
HISD Cycle 7, year 1
Be A Champion, Inc. Cycle 6, year 2
Be A Champion, Inc. Cycle 6, year 3
Be A Champion, Inc. Cycle 6, year 4
Texas Serenity Academy Cycle 7, year 2
Principal evaluator for this program: Roger Durand, Ph.D. Roger Durand holds a Ph.D. awarded with Distinction (University of California – Berkeley and Los Angeles campuses) and has completed post-doctoral studies in survey research, research design, and statistical modeling at the Institute for Social Research at the University of Michigan as well as in the formal modeling of change at a National Science Foundation program conducted at Virginia Tech.
In addition to being a member of the Waits Consulting Group, he serves as Professor of Public Affairs at the University of Houston-Clear Lake, where his teaching has emphasized graduate courses in program evaluation. Previously he taught evaluation courses and coursework at UCLA, the University of Missouri, the University of Colorado, and the University of Houston – Central Campus. Besides his academic career, Dr. Durand has served as Senior Analyst in the Division of Evaluation, U.S. Department of Health, Education, and Welfare, and later the U.S. Department of Health and Human Services, in Washington, D.C. In that capacity he directed evaluations of the National Health Service Corps Scholarship Program, the Community Mental Health Services Program, and educational programs of the Bureau of Health Professions of the Health
  7. 7. Waits Consulting 2 Services Administration. He also served as evaluator on the staff of the Graduate Medical Education National Advisory Committee and as evaluation advisor to the White House Domestic Policy Council. Throughout his career Dr. Durand has been involved in more than 75 program evaluations, serving as Principal Investigator, Co-Principal Investigator, or External Evaluator on projects funded by the U.S. Public Health Service, the U.S. Department of Labor, the U.S. Environmental Protection Agency, the U.S. National Aeronautics and Space Administration, the America's Promise Foundation, the Hogg Foundation for Mental Health, the Texas Coordinating Board for Higher Education, and the United Way. Since 2006 Dr. Durand has served as the evaluator of the Houston's Kids Out-of-School-Time Program, a collaborative effort of the Alief Independent School District, the Children's Museum of Houston, Communities in Schools of Houston, the YMCA of Greater Houston, the United Way of Greater Houston, and the America's Promise Foundation. Previous to that he served as the evaluator of “HELP for Kids,” a four-year project funded by a grant from the U.S. Department of Health and Human Services to the Greater Houston Collaborative for Children. More recently, he served on the advisory board of Neighborhood Centers, Inc., a non-profit agency, which successfully devised a plan to gather baseline and other performance evidence regarding elementary education in the Gulfton neighborhood of Houston. And he has served as evaluator for the Out-of-School-Time Agencies Affinity Group of the United Way of Greater Houston. A member of the American Evaluation Association, Dr. Durand has made numerous presentations at its annual evaluation research conferences and presently is a reviewer of evaluation paper proposals for the 2013 international meetings. The author or co-author of more than 200 peer-reviewed publications, research papers, monographs, and book chapters, Dr.
Durand's evaluation work has appeared in a variety of professional venues including Social Forces, the Social Science Quarterly, the American Journal of Public Health, the Journal of Mathematical Sociology, Health Policy and Education, Sociology and Social Research, the Urban Affairs Quarterly, and the Journal of Organization Culture, Communications, and Conflict. a. Scope of Work for Evaluator The Afterschool Centers on Education (ACE) is the program administered by the Texas Education Agency for the Federally-funded 21st Century Community Learning Center (CCLC) grants authorized under Title IV, Part B of the Elementary and Secondary Education Act (ESEA), hereafter referred to as the “Act,” as amended by the No Child Left Behind Act of 2001 (NCLB; Public Law 107-110). Under Section 4205, paragraph b(2) of ESEA as amended, a periodic evaluation of ACE is required. The purpose of such an evaluation is to refine, improve, and strengthen the program and to refine performance measures. Pursuant to the provisions of the Act, the Waits Consulting Group completed both a process and an outcomes evaluation of the Cycle 6, Year 4 ACE of the Be A Champion, Inc., program (hereafter referred to as “BAC”). The principal purpose of the process evaluation was formative in nature. That is, the main aim was program development and improvement with particular
  8. 8. Waits Consulting 3 attention to implementation. The principal purpose of the outcomes evaluation, on the other hand, was summative: to assess the program's effectiveness in producing desired outcomes. In completing these evaluations, the Waits Consulting Group conducted all of the following activities:
Reviewed the program's grant application, including stated community needs; the manner of addressing those stated needs; targeted population groups; the program's expectations; the involvement of stakeholders; and resources available to the program.
Met with program site coordinators to further understanding of the specific strategies used to achieve program objectives.
Developed a plan for evaluating the process or implementation of the afterschool program; the program's outputs; and the outcomes of the program in light of program, State, and Federal government CCLC/ACE objectives.
Coordinated the collection and the quality monitoring of all data included in this report.
Conducted site visits to the learning centers covered in this report.
Collected and analyzed survey data on the program from school principals.
Analyzed student and parent survey evidence.
Gathered all quantitative and qualitative data on the program included in this document.
Collected campus-level data not reported in the 21st Century Community Learning Centers databases.
Protected and ensured the confidentiality of data.
Systematically analyzed all assessment data utilizing appropriate computer software and statistical models.
Made recommendations to guide performance improvement and program sustainability.
Prepared a final evaluation report.
b. Cost for Evaluator The cost for the program evaluation services rendered above is $20,000. II.
Purpose of Program As noted briefly in the preceding section of this report, the Afterschool Centers on Education (ACE) is the program administered through the Texas Education Agency for the Federally-funded 21st Century Community Learning Center (CCLC) grants authorized under Title IV, Part
  9. 9. Waits Consulting 4 B of the Elementary and Secondary Education Act (ESEA), as amended by the No Child Left Behind Act of 2001 (NCLB; Public Law 107-110). The principal purposes of Texas ACE programs are to enhance academic performance, increase school attendance, improve student behavior, positively affect grade promotion rates, and increase student graduation rates. Be A Champion, Inc. (BAC) has developed and implemented a Cycle 6 set of learning centers, now in Year 4, intended to achieve these purposes. It has done so by means of providing a before-school and after-school program for students on nine (9) campuses in the Klein Independent School District. a. Description of the program needs and how the program addressed those needs. In order to identify community and school campus needs, BAC conducted a comprehensive needs assessment. Included in the assessment were data derived from surveys of parents, meetings with teachers, discussions with campus principals, conversations with school district officials, and results from a study conducted by the After School Alliance. The assessment also included an analysis of demographic, at-risk, English language proficiency, school improvement, and test score data provided by the Klein Independent School District. The needs assessment revealed that slightly over 77% of students on the nine campuses served by BAC were from economically disadvantaged backgrounds. It also showed that a large number of these students were left unsupervised as a result of the hours worked by their parents. Utilizing a study conducted by the After School Alliance, BAC noted an association between “latchkey” student status and lower achievement, as well as a greater propensity of these students to participate in juvenile crime.
Additionally, teachers included in the needs assessment expressed concern about a lack of homework completion, while campus discussions indicated that students did not have the opportunity to participate in specialty classes. Further, surveys revealed the need for job preparation assistance as well as financial awareness and home health education on the part of parents. Analysis of data provided by the Klein Independent School District as part of the needs assessment, data shown in Table 1 immediately below, demonstrated further evidence of need. As indicated in the table, all of the nine campuses served by BAC had percentages of students lower in socioeconomic status and higher in being classified as “at risk” than either the Texas statewide average or the average for the Klein Independent School District as a whole. Additionally, four BAC Cycle 6 schools “missed” AYP in reading and four missed AYP in math. Moreover, one of the BAC Cycle 6 schools, Klein Intermediate, was in Stage 1 for School Improvement (SIP) in both reading and math.
  10. 10. Waits Consulting 5 Table 1. Description of Need
Center | Percent Low SES | Percent At Risk | Percent LEP | AYP Reading | AYP Math | School Improvement Program Requirements
Klein Intermediate | 83.6 | 54.0 | 21.0 | Missed AYP | Missed AYP | Stage 1: Reading, Math
Wunderlich Intermediate | 73.9 | 41.5 | 15.2 | Meets AYP | Meets AYP | None
Eiland Elementary | 89.4 | 63.4 | 51.9 | Meets AYP | Meets AYP | None
Epps Island Elementary | 90.0 | 73.6 | 55.7 | Missed AYP | Meets AYP | None
Greenwood Forest Elementary | 64.9 | 43.4 | 29.0 | Meets AYP | Meets AYP | None
Kaiser Elementary | 89.1 | 71.6 | 53.3 | Missed AYP | Missed AYP | None
Klenk Elementary | 69.0 | 53.9 | 37.2 | Meets AYP | Meets AYP | None
McDougle Elementary | 84.7 | 66.0 | 49.2 | Meets AYP | Meets AYP | None
Nitsch Elementary | 86.6 | 54.4 | 27.6 | Missed AYP | Missed AYP | None
Klein ISD | 42.0 | 36.9 | 12.8 | Missed AYP | Missed AYP | Stage 1: Reading, Math
State of Texas | 60.4 | 45.4 | 16.8 | N/A | N/A | N/A
Source: Texas Education Agency, Academic Excellence Indicator System, 2011-2012 Campus Performance
Source: Texas Education Agency, Adequate Yearly Progress Campus Data Table, Final 2012 AYP Results
Following the assessment, the BAC staff along with District and campus officials designed a program intended to address the identified needs. The program that was designed provided both a before-school and after-school program of academic assistance and enrichment for students as well as one intended to meet parents' needs. A “study lounge” was devised for student academic enrichment while students were provided course offerings in sports, dance, art, music, drama, culinary arts, STEM, college readiness, and life skills. Moreover, each student was provided a development plan to track their success in the out-of-school-time program. Further enhancing the academic experience for students, a partnership was developed with the University of Houston to provide college campus visits and mentoring. Finally, parents were offered a “Parent University” that featured speakers, activities, and workshops related to the needs revealed by the BAC assessment.
(Further description of specific activities and the program objectives such activities were intended to serve are shown in Table 8 of this report as well as in the report appendix.) b. Philosophies employed and how this is disseminated to stakeholders. The philosophy employed by BAC was premised on the Nine High-Yield Strategies developed by Dr. Robert J. Marzano (see Marzano, 2009; Zanesville, 2002; Teachscape, 2013). The program's instructional staff utilized the philosophy and its associated methodology in the student activities that were provided in the BAC program. This philosophy and methodology was further incorporated into the program along with the “Five E model” of Engage, Explore, Explain, Elaborate, and Evaluate (see Enhancing Education, 2013). The program, including its philosophy and its results, is disseminated to stakeholders by means of the Klein Independent School District Web site and school board updates as well as through emails and monthly newsletters communicated in several languages. Additionally, BAC collaborates closely with the District's liaison to ensure the effective dissemination of information about the program.
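The needs comparison described above can be checked directly against the Table 1 figures. The following sketch (the campus percentages are taken from Table 1; the variable names are my own) tallies how many BAC campuses exceed the Klein ISD averages on both poverty and at-risk measures, and how many exceed the statewide low-SES average.

```python
# Sketch: reproducing the needs comparison from the Table 1 figures
# (percent low SES, percent at risk) for the nine BAC campuses.

campuses = {
    "Klein Intermediate": (83.6, 54.0),
    "Wunderlich Intermediate": (73.9, 41.5),
    "Eiland Elementary": (89.4, 63.4),
    "Epps Island Elementary": (90.0, 73.6),
    "Greenwood Forest Elementary": (64.9, 43.4),
    "Kaiser Elementary": (89.1, 71.6),
    "Klenk Elementary": (69.0, 53.9),
    "McDougle Elementary": (84.7, 66.0),
    "Nitsch Elementary": (86.6, 54.4),
}
district = (42.0, 36.9)   # Klein ISD averages from Table 1
state = (60.4, 45.4)      # State of Texas averages from Table 1

# Campuses above both district averages, and above the state low-SES average.
above_district = [n for n, (ses, risk) in campuses.items()
                  if ses > district[0] and risk > district[1]]
above_state_ses = [n for n, (ses, _) in campuses.items() if ses > state[0]]

print(len(above_district), "of", len(campuses), "campuses exceed both Klein ISD averages")
print(len(above_state_ses), "campuses exceed the state low-SES average")
```

All nine campuses clear both district averages and the statewide low-SES average, consistent with the evidence-of-need discussion above.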
  11. 11. Waits Consulting 6 Citations for this section:
Enhancing Education (2013). The five-e's. Retrieved from http://enhancinged.wgbh.org/research/eeeee.html
Marzano, R. J. (2009). Setting the record straight on 'high-yield' strategies. Retrieved from http://www.marzanoresearch.com/documents/Marzano9-09.pdf
Teachscape (2013). Teachscape's high yield strategies. Retrieved from http://www.scholastic.com/teachers/article/teachscapes-high-yield-strategies
Zanesville (2002). Retrieved from http://www.zanesville.k12.oh.us/cms/lib02/OH16000170/Centricity/Domain/56/DOCUMENTS/MarzanoHighYieldStategies.pdf
III. Program Objectives a. Expectations The principal objectives of the BAC Cycle 6 out-of-school-time program in Year 4 included each of the following:
Student Academic Improvement. Ninety percent (90%) or more of participating students will develop a personal development plan (PDP) that is reviewed and maintained by students and program staff. This PDP links the school day to afterschool and will result in increased academic performance.
Improve Student Attendance. Adult family members of at least 50% of all participating students will participate in literacy, educational and/or college readiness, workforce development, or financial education activities. This will lead to support and increased attendance.
Improve Student Behavior. Ten percent (10%) or fewer of the total number of students enrolled at each campus will be cited for criminal or non-criminal activities.
Improve Student Promotion Rates. Ninety percent (90%) of student participants will participate in college awareness, career exploration, and science/technology activities.
Improve Student Graduation Rates. Ninety percent (90%) of the total number of students enrolled at each campus will receive a grade promotion by spring 2011.
In Table 2 immediately below, the measures utilized in this evaluation of progress together with their respective data sources are summarized according to program objective.
Table 2: Out-of-School Time Program Major Objectives, Measurement of Objectives, and Data Source

Program objective | Measurement | Data source
Student academic improvement | Increase in math and reading grades from fall to spring | Texas21st database
Improve student attendance | Student attendance rates | Texas21st database
Improve student promotion rates | Student promotion and retention rates | Texas21st database
Improve student behavior | Number of discipline referrals | Texas21st database

A multiple-measures, multiple-comparison-groups evaluation design was utilized to measure progress in achieving program objectives. All data were analyzed using the Statistical Package for the Social Sciences (SPSS) with statistical models appropriate to the level of measurement (nominal, ordinal, interval).

*(Important note: Evaluators continue to debate the use of "propensity scores" in analyzing program outcomes by means of quasi-experimental designs and participant-nonparticipant comparisons (or low- versus high-level-of-participation comparisons) of the kind utilized in this report. Such propensity scores and comparisons have been advanced as means of avoiding or reducing selection biases in measuring program effects. See, for example, Holmes, W. M. (2013). Using propensity scores in quasi-experimental designs. Thousand Oaks, CA: Sage. Yet the use of such scores entails a number of problems. First, such scores can never fully eliminate selection biases; they can only reduce the effects of some potential biases. Second, their use does not avoid the ceteris paribus assumption underlying all evaluation designs and results. Third, the use of such scores requires a large number of observations on individual participants and non-participants (or infrequent participants) alike, observations not available in this evaluation. Finally, selection itself may often be considered a part of a program.
This was true in the present case, in which a needs assessment, the interest of enrolled students, and parents' desires determined program strategies and activities. Accordingly, propensity scores are not utilized in the analysis that follows.)

b. Status of objectives

Principal findings from the evaluation of outcomes are presented in Table 3 immediately below. Further details will be found in Section V of this report.
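As a minimal illustration of the fall-to-spring grade-change measure described in Table 2, the sketch below classifies each participant into the four categories used in the report's charts (increase, no change, decrease, no change needed). The actual analysis was performed in SPSS; the grading scale, the "no change needed" ceiling rule, and the sample grades here are assumptions for illustration only, not program data.

```python
from collections import Counter

MAX_GRADE = 100  # assumed top of the grading scale

def classify_change(fall, spring, ceiling=MAX_GRADE):
    """Classify a fall-to-spring grade change into the four chart
    categories (assumed logic, mirroring the report's labels)."""
    if fall >= ceiling:
        return "no change needed"  # already at the top of the scale
    if spring > fall:
        return "increase"
    if spring < fall:
        return "decrease"
    return "no change"

# Hypothetical (fall, spring) reading grades -- not actual program data.
grades = [(85, 90), (92, 92), (100, 100), (78, 70), (88, 88)]
dist = Counter(classify_change(f, s) for f, s in grades)
total = len(grades)
for category, n in dist.items():
    print(f"{category}: {n / total:.0%}")
```

Tabulating the categories this way yields the proportions displayed in Charts 1 through 4 of Section V.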
Table 3: Status of Evaluation Outcomes

Program objective | Status
Student academic improvement | Met
Improve student attendance | Met
Improve student promotion rates | Not met
Improve student behavior | Met

IV. Program Description

In this report section, a detailed description of the BAC Cycle 6, Year 4 program is provided. Since this is a multi-site program, most of the data reported in this section are site-specific, thus affording inter-site comparisons of the program. To assist the reader's understanding of the large volume of data reported, heavy reliance is placed on tabular presentations.

a. Proposed budget and actual expenditures

Table 4: Budget and expenditures

Category | Projected | Actual | Cost per regular participant
Payroll | $863,322.00 | $733,278.58 | $381.72
Professional and Contracted Services | $309,428.00 | $310,584.89 | $161.67
Supplies and Materials | $30,000.00 | $33,054.87 | $17.02
Other Operating Costs | $3,500.00 | $2,998.63 | $1.56
Equipment | $5,000.00 | $4,915.25 | $2.55
Total | $1,211,250.00 | $1,084,832.19 | $564.52

b. Structure of Program

In Table 5 below, the structure of the program, including hours per week, number of weeks, and total program enrollment, is reported. As will be noted, the spring term was longer in weeks than the fall term and had a larger total enrollment. Hours for the present summer term are also shown.

Table 5: Program Operation and Structure

Academic term | Hours per week | Weeks per term | Total enrollment
Fall (2012) | 15 | 12 | 1182
Spring (2013) | 15 | 23 | 1923
Summer (2013) | 20 | 4 |

Source: Texas21st

c. Staff, including specific professional development opportunities, staff participation, and expectations of performance as a result of the professional development

In general, the expectation was that all staff hired for the program possess the qualifications and experience necessary to provide a high-quality, safe, and secure program of academic enrichment and services for participants. It was also expected that staff would discharge the responsibilities entrusted to them in a timely and effective manner and observe professional behavior at all times. In Table 6 immediately below, the roles and staffing of this out-of-school-time program are presented.

Table 6: Out-of-School-Time Center Roles and Staffing

Staff title: Project Director

Responsibilities: The Project Director is responsible for multi-site management of the 21st Century Community Learning Center program in Klein ISD in accordance with the mission and policies of BE A CHAMPION, INC.
Recruit, interview, train, mentor, supervise, evaluate, and monitor staff according to guidelines, and maintain a positive work environment for staff that includes recognition/rewards. Develop a staff retention plan in accordance with administration. Actively cultivate additional community-based support and partnerships for the program to sustain it after funding lapses. Attend national, state, and regional 21st CCLC meetings. Seek additional funding opportunities to support 21st Century programs and activities.

Qualifications: Bachelor's degree from an accredited institution and a minimum of 3 years' experience working with youth. Understanding and knowledge of children, their behavior and development, and the ability to use this knowledge as a resource with staff, with specific knowledge of the development of age-appropriate curriculum activities and the use of appropriate behavior
management techniques and the ability to transfer this knowledge to staff.

Further responsibilities of the Project Director: Engage stakeholders and community members in 21st CCLC activities and disseminate evaluations. Provide support to partner organizations and subcontractors to ensure effective programming. Ensure that the site maintains and utilizes current, updated attendance rosters on a daily basis. Maintain open communication with Klein ISD personnel regarding daily operations, concerns, and curriculum needs. Communicate effectively with parents, staff, and school personnel through various means. Promote the 21st CCLC program in the schools and in the community. Address concerns and demands of parents, subcontractors, and school administration regarding difficult situations, and troubleshoot with administration toward effective outcomes.

Further qualifications: Organizational skills relating to handling documents and related paperwork. Supervisory and administrative skills, including supervision of site staff and effective administration of the program at the various sites. Ability to provide leadership with staff in accordance with Be A Champion, Inc.'s mission.

Staff title: Site Coordinators

Responsibilities: Recruits students and parents for program activities. Participates in professional development in core contents, program of studies, and reading, math, and science best practices. Requests supplies for the program and maintains program inventory. Coordinates with program partners to offer enrichment and recreational activities. Gathers student academic and non-cognitive data and parent data for evaluation. Manages all program data collection, record keeping, evaluations, and staff timekeeping, and submits electronic daily reports. Coordinates with local food services to provide snacks for students.

Qualifications: A four-year degree or work experience is required. Demonstrates strong organizational skills, is goal-oriented, and has the ability to multitask. Demonstrates the ability to work effectively with a variety of people, faculty
and staff, students and families, and agencies. Demonstrates the ability to coordinate training and professional development for staff.

Source: Klein ISD

Table 7: Program Staff Development

Training offered/attended | Attendee type | Description of training
1) TX 21st Century Conference | PD & 2 Site Coordinators | The PD and SCs each attended 7 of 50 sessions (21 in total) providing insight and ideas on project management, budgeting, activities, and classroom management.
2) BOOST | Educational Aide | Out-of-school-time networking conference for
attendees to learn and develop new grant implementation techniques and hands-on, engaging activities for students.
3) Foundations | Educational Aide | Beyond School Time conference for networking, to share ideas and learn the most up-to-date tools for out-of-school-time programming.
4) NAA's | Educational Aide | To share ideas and best practices, explore new topics, and help develop high-quality learning opportunities for all participants.
5) Family Engagement Networking Meeting | Educational Aide/FES | Reviewed and collaborated on ideas and best practices for family engagement activities.
6) CASE Training - Mr. Happy | Front-line staff | Classroom instructors received training on classroom management techniques and activity ideas from Mr. Happy.
7) Project Director Webinars | PD | Provided project updates, grant requirements, and important notices for 21st Century.
8) Project Director Retreat | PD (SC attended in place of PD) | Focused on sustainability for grant reductions.
9) Cultural Competency | Educational Aide/FES | Cultural awareness and understanding training about the impact of cultural differences among families and in the workplace. This training provided the necessary foundation and tools to become culturally competent and a better understanding of the cultural challenges facing multicultural workplaces and families.
10) Communication Networking Meeting | PD | Region 4 meeting to discuss grant and project updates and notices.
11) TAC Networking Meeting (hosted by Paula Ware) | PD | Meeting with Houston-area grantees in Paula Ware's region to discuss grant and project updates and notices, as well as best practices Paula selected as effective management tools.

d. Activities offered, including levels of participation

A complete, detailed breakdown of the program's activities, including enrollments, days scheduled, and average daily attendance for students and parents by after-school center, is
shown in Appendix 1 of this report. (See also subsection e immediately below.) The following salient observations were derived from a close inspection of the evidence presented in Appendix 1:

The number and type of student activities varied considerably from center to center, as expected, since activities were determined by community need as well as by interest.

Also as expected, there was generally, though not always, variation in student enrollment across activities at each center. For example, at Greenwood Forest Elementary School student enrollments ranged from a low of 26 in "123s x ABCs" to a high of 130 in several activities (e.g., Motions for Dance, Sports V, and TechKnow Kids). On the other hand, at other centers, Kaiser Elementary School being an example, student enrollments were reported as constant across all activities.

In general, average daily student attendance varied considerably across activities at each center. At Nitsch Elementary School, for example, average daily attendance among students reportedly ranged from a low of five (5) for the culinary activity to a high of 73 for homework assistance.

At all of the centers, parents were found to be active in only one activity, generally termed "Parent University" but also listed as "Parent Program" or "Parent Night" at two of the centers. Parent enrollment across the centers averaged 28.89 with a standard deviation (or "variability") of 13.05. The average daily attendance of parents in their enrolled activity, however, was 10.33 with a standard deviation of 8.06. Thus, average daily parent attendance across the centers was only about 36% of enrollment, and attendance was both low and much more uniform than enrollment (as judged by comparing the two standard deviations).

e. How activities relate to program objectives

The Be A Champion, Inc.
program offers numerous activities for students to choose from. Listed in Table 8 below is a sample of course offerings. A more detailed list of the activities offered at each campus is shown in the Appendix.
Table 8: Matrix of Program Activities and Components Addressed

Components: Academic Assistance; Enrichment; Family and Parental Support Services; College and Workforce Readiness

Activity | Component addressed
Academic Assistance | x
Homework Help | x
Parent University | x
Excel at Sports | x
Parent Night | x
Soccer Club | x
Praise Through Dance | x
Arts Exploration | x
Good Eats | x
Notions of Dance |
S.T.E.M. Is In | x

V. Outcomes

In this report section, the outcomes of the BAC Cycle 6 ACE program for Year 4 are presented. Initially the "key outcomes," those related most closely to the objectives of the program, are discussed. Then findings concerning other outcomes are shown.

a. Key Outcomes

In Charts 1 and 3 immediately following, changes in reading and math grades by the end of the cycle for all BAC centers combined are displayed. In Charts 2 and 4, comparable grade changes are shown for all State of Texas 21st Century participants during the same, current academic year. These latter charts have been included to provide a comparison "yardstick" by which to assess BAC grade outcomes.
Chart 1: Changes in reading grades from fall to spring
(BAC year-end reading grades, cycle summary: Increase 11%; No Change 55%; Decrease 16%; No Change Needed 18%)

Chart 2: Reading grade changes for all State of Texas 21st Century participants
(Increase 20%; No Change 43%; Decrease 21%; No Change Necessary 16%)

As seen in Charts 1 and 2 above, more BAC and State participants remained the same with regard to their fall-to-spring grades than showed either an increase or a decrease. This is all the more true when the proportions of those for whom no change was necessary are combined with the proportions of those who did not change. Thus, a
combined proportion of 73% of BAC program participants either did not change in their reading grade or did not need to do so, while a combined proportion of 59% of State participants were either unchanged or did not need to change. These results are consistent with an interpretation of an afterschool "maintenance effect," in which the program reinforced or maintained reading grades for a majority of participants. Of course, other interpretations and explanations are possible. (On this point, see Finding 1 and Recommendation 1 of this report.) Yet, as already noted above, nothing in the data available to this evaluation contravenes the argument that grade maintenance or reinforcement was the predominant program effect. Finally, the evidence further shows that BAC had a smaller proportion of reading grade increases compared to State participants as a whole, but also a smaller proportion of grade decreases.

Chart 3: Changes in math grades from fall to spring
(BAC year-end math grades, cycle summary: Increase 14%; No Change 50%; Decrease 18%; No Change Needed 18%)
Chart 4: Math grade changes for all State of Texas 21st Century participants
(Increase 22%; No Change 44%; Decrease 18%; No Change Necessary 16%)

The data on changes in math grades shown in Charts 3 and 4 immediately above tended to mirror those for reading grade changes. As was true of reading grades, most participants remained unchanged in their math grades from fall to spring, especially when the categories of "no change" and "no change necessary" were combined. Moreover, this was so for BAC as a whole as well as statewide. Once again, these results are consistent with an interpretation of program reinforcement or maintenance effects. Finally, BAC participants showed proportionately fewer math grade increases than State participants as a whole, but about the same percentage of grade decreases.

Further details of reading and math grade changes by individual BAC center are contained in Appendix 1 of this report. The most salient observation to be made from comparing grade changes across individual centers within the BAC ACE program is the considerable variation that was found. Without additional data, it is difficult to know whether this intercampus variation resulted from differences in program quality, in "student mix" and background characteristics, or from some other source (e.g., parental values).

In Table 9 immediately below, the AY 2012-13 attendance rates for ACE participants by ACE site are shown along with attendance rates for their respective campuses as a whole for AY 2010-11. (These latter data for the campuses as a whole are the most recent comparative data available.)

Table 9: Participant and campus attendance rates

Campus | Participant attendance rate (2012-13) | Campus attendance rate (2010-11)
Eiland ES | 98.1% | 96.9%
Epps Island ES | 98.4% | 96.9%
Greenwood Forest ES | 97.6% | 96.9%
Kaiser Int. | 98.6% | 97.0%
Klein Int. | 98.6% | 95.2%
Klenk ES | 98.9% | 97.4%
McDougle ES | 98.3% | 97.6%
Nitsch ES | 98.0% | 97.1%
Wunderlich ES | 98.0% | 96.4%

As is evident from the table, the ACE participant attendance rate exceeded the overall campus rates. (The rate for participants across sites was 98.28%, while that for the campuses as a whole was 96.82%.) Moreover, taken individually, at each campus the attendance rate for ACE participants exceeded that for the entire campus. These data are consistent with a conclusion of an attendance improvement attributable to the afterschool program. (Of course, other interpretations and conclusions are possible. Yet nothing in the data available for this report contravenes the conclusion of a positive ACE program effect.)

Note: Neither the Klein Independent School District nor TEA reports promotion rates or discipline referrals. Thus, such data could not be included and discussed in this evaluation report.

b. Additional program data collected for elements specified in the RFP

Additional data called for in the RFP, in the form of evidence on required Federal and state performance measures, are displayed in Table 10 immediately below.

Table 10: BAC, Cycle 6, Year 4 performance measures required by Federal and state granting authorities

(The table reproduces Texas Education Agency Standard Application System (SAS) Schedule #4C, Performance Assessment and Evaluation, Part 2: Performance Targets, for the Texas 21st Century Community Learning Centers, Cycle 6, Year 4, School Year 2012-2013.)

# | Performance measure | Assessment instrument/tool | Projected % or # | Actual % or #
1 | Percentage of all 21st Century regular program participants whose mathematics and English grades improved from fall to spring | Report card grades from Personal Development Plan | 20% | 26.1% in English and 26.3% in mathematics
2 | Percentage of all 21st Century regular program participants who improve from not proficient to proficient or above in TAKS reading and TAKS mathematics | Student results abstracted into database | 65% | Data not available at this time
3 | Percentage of all 21st Century regular program participants with teacher-reported improvement in homework completion and class participation | Teacher surveys | 65% | Data not available
4 | Percentage of all 21st Century regular program participants with teacher-reported improvements in student behavior | Teacher surveys | 75% | Data not available
5 | Percentage of all 21st Century regular program participants showing improvement in school day attendance | Campus records | 70% | 71%
6 | Percentage of students in K-11 that promote to the next grade as of the end of the school year | Students' final report cards | 70% | 58%
7 | Percentage of 11th and/or 12th grade high school students that graduate at the end of the school year/summer | Students' final report cards | n/a | n/a
8 | Percentage of all 21st Century program participants that attend 30 hours or more of programming per term | Attendance logs | 90% | 76% as of April 2012
9 | Percentage of all 21st Century program participants involved in extracurricular school activities (this includes U.I.L.-sponsored activities as well as after-school-sponsored activities) | Campus records | 50% | 4%
10 | Percentage of all 21st Century program participants whose activity selection is based on a needs assessment | Student development plans & surveys | 100% | 100%
11 | Percentage of 21st Century Site Coordinators who implement strategies learned as a result of trainings attended (based on Project Director assessments) | Site visit logs | 100% | 100%
12 | Total number of innovative instructional activities offered | Program schedules | 35% | 62
13 | Total number of students meeting with an assigned adult advocate (does not include school day teachers or staff) | Attendance logs | 20% | 6%
14 | Total number of parent meetings held by the Site Coordinator | Signed logs | 65 | 86
15 | Total number of school day staff meetings held by the Site Coordinator | Meeting minutes | 30 | 18
16 | Total number of pre- and post-test assessments conducted | Completed assessments | N/A | N/A
17 | Total number of staff members receiving training | Sign-in sheets | 60 | 80

The above data are self-reports of performance provided to the Waits Consulting Group evaluation team by BAC, Cycle 6, Year 4 out-of-school-time officials. As will be noted, some of the data in Table 10 are self-reported outcomes (e.g., the percentage of all 21st Century regular program participants whose mathematics and English grades improved from fall to spring); others are "outputs" (e.g., the percentage of all 21st Century program participants that attend 30 hours or more of programming per term); while still others may be considered "process" or "implementation" measures (e.g., the total number of innovative instructional activities offered). These separate types of performance measures (outcomes, outputs, process) have all been included in this report in order to keep together all of the measures required by state and Federal granting authorities.
Some of the program performance results reported in the above table appear to be quite positive. For example, 100% of the activity selections of 21st Century participants were reportedly based on a needs assessment. On the other hand, several of the reported performance results suggest the need for improvement. Improvement seems especially needed with regard to the percentage of students who were promoted to the next grade (58%). Other percentages in the above table (e.g., grade improvements in reading and math) are difficult to assess in the absence of true baseline or "pre-program" numbers.
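Returning to the attendance comparison in Table 9, the cross-site averages quoted in the accompanying text (98.28% for participants, 96.82% for campuses) can be verified directly from the tabled rates. The sketch below uses unweighted means across the nine sites, which reproduce the reported figures; an enrollment-weighted average would require per-site enrollment counts not given in the table.

```python
# Participant vs. whole-campus attendance rates from Table 9 (percent).
participant = [98.1, 98.4, 97.6, 98.6, 98.6, 98.9, 98.3, 98.0, 98.0]
campus      = [96.9, 96.9, 96.9, 97.0, 95.2, 97.4, 97.6, 97.1, 96.4]

p_avg = sum(participant) / len(participant)
c_avg = sum(campus) / len(campus)
print(f"ACE participants: {p_avg:.2f}%")  # report quotes 98.28%
print(f"Campuses overall: {c_avg:.2f}%")  # report quotes 96.82%

# At every individual site, the participant rate exceeds the campus rate,
# matching the site-by-site claim in the text.
assert all(p > c for p, c in zip(participant, campus))
```

Both the pooled averages and the site-by-site comparison confirm the arithmetic behind the report's conclusion.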
c. Additional program data collected

In addition to the data collected and discussed above, surveys of school principals were also collected. In total, only seven (7) surveys were completed and returned. Accordingly, the reader is cautioned to exercise care in interpreting these surveys, which are reported in counts (rather than percentages) in the tables below.

Be A Champion Survey Results from Principals, 2013

1. Do you agree or disagree with the following statements about the relationship between your school and the after-school program administered by Be A Champion, Inc.?

(Response options, in order: Strongly Agree / Agree / Disagree / Strongly Disagree / Not Sure)

a. There is a strong partnership between the after-school program and my school: 3 / 3 / 1
b. The after-school program keeps me informed of important decisions and issues: 3 / 3 / 1
c. Teachers in my school are willing to collaborate with the after-school program staff: 2 / 5
d. After-school program staff are responsive to my ideas and suggestions: 3 / 3 / 1
e. After-school staff reach out to teachers in the school to identify the needs of students: 2 / 3 / 1 / 1
f. After-school staff are responsive to the teachers in the school: 3 / 3 / 1
g. School staff are encouraged to visit the program: 1 / 5 / 1
h. After-school staff transmit important information about children and parents to me and my staff in a timely fashion: 2 / 4 / 1
i. After-school: 2 / 4 / 1
staff take care of the space the school provides the program.
j. I am satisfied with the extent to which the after-school program involves me in decisions about program operations: 2 / 4 / 1
k. Students are properly supervised by after-school program staff: 3 / 2 / 1 / 1
l. Curriculum and instruction in the after-school program reinforce concepts being taught during the school day: 1 / 5 / 1
m. The after-school program provides students with learning opportunities not available during the regular school day: 0 / 5 / 1 / 1
n. The after-school program has enough capacity to serve all interested students: 1 / 4 / 1 / 1
o. The after-school program is well coordinated with other after-school programs at the school: 3 / 2 / 1 / 1

Note: Table entries are the numbers of survey respondents.
2. In your judgment, to what extent does the after-school program:

(Response options, in order: Very much / Somewhat / A little / Not at all)

a. Enhance the overall effectiveness of the school? 5 / 1 / 1
b. Enhance students' motivation to learn? 1 / 5 / 1
c. Contribute to improved student skills in reading? 1 / 5 / 1
d. Contribute to improved student skills in math? 1 / 5 / 1
e. Enhance students' attitudes toward school? 3 / 3 / 1
f. Improve students' safety? 5 / 1 / 1
g. Improve student attendance? 3 / 2 / 2
h. Reduce vandalism at the school? 2 / 4 / 1
i. Increase parents' attendance at school events? 1 / 4 / 1 / 1
j. Increase parents' attendance at parent-teacher conferences? 1 / 3 / 1 / 2

3. What are the most effective strategies used by the after-school program to complement or augment curriculum, content, and/or activities from the regular school day? (Circle all that apply.)

The after-school program is not aligned or coordinated with the regular school program in any way (skip to the next question): 3
a. The after-school program uses school administrators (e.g., deans, assistant principals) to advise or monitor activities: 3
b. After-school staff coordinate homework assistance with classroom teachers: 3
c. The after-school program adopts school themes for special projects: 2
d. The after-school program solicits input from the principal and teachers on skills in which students need help, and incorporates these topics into after-school activities: 5
e. The coordinator of the after-school program serves on a school planning team: 0
f. Other (specify): 0

4. How often do you visit the after-school program?
a. Never
b. Less than once a semester
c. 1-2 times a semester: 2
d. About once a month
e. A few times a month: 2
f. At least once a week: 3

5. Is there a school-wide mechanism or strategy in place for communication between school-day teachers and after-school staff about homework assignments?
a. Yes: 3
b. No: 4

(continued below)
6. What have been the primary benefits to the school of hosting the after-school program? (Circle all that apply.)

None - I do not think there are any major benefits to the school (skip to the next question): 1
a. Parents express more positive feelings about the school because it provides a safe place for their children after school: 5
b. Students receive additional opportunities to develop literacy skills: 2
c. Students have opportunities to participate in activities not available during the regular school day: 5
d. Teachers from the school have opportunities to work with children outside the classroom: 2
e. Other (specify)

7. What have been the primary drawbacks to the school of hosting the after-school program? (Circle all that apply.) One respondent did not answer this question.

None - I do not think there are any major drawbacks to hosting the program (skip to next question): 5
a. The amount of school staff time required for program coordination
b. Increased concerns for students' safety getting home from the program
c. Sharing classroom and other school facilities with another organization: 1
d. Other (specify): 1 - "Before school program staff is consistently late."
8. In your judgment, what are the strengths of the after-school program? (Circle all that apply.)

None - I do not think this is a very strong program (skip to next question): 1
a. The qualifications and leadership of the coordinator: 6
b. Staff rapport with participants: 6
c. Choice/diversity of activities
d. Quality of homework help
e. Quality of academic enrichment activities: 2
f. Coordination/integration with school curriculum
g. Connections to the community: 5
h. Connections with parents: 5
i. Other (specify): 1 - "Before and after school care."

9. In your judgment, what aspects of the program need attention? (Circle all that apply.) One survey respondent did not answer this question.

None - I think the program is fine as it is (skip to next question): 2
a. The qualifications of site coordinators: 1
b. The qualifications of other program staff
c. Coordination with the school
d. Staff rapport with participants
e. Choice/diversity of activities
f. Quality of homework help
g. Quality of academic enrichment activities: 2
h. Quality of other activities
i. Coordination/integration with school curriculum: 1
j. Connections to the community: 1
k. Connections with parents: 1
l. Number of staff: 2
m. Staff turnover: 1
n. Other (specify)

10. Compared to last year, to what extent would you say that...

(Response options, in order: To a great extent / To some extent / A little / Not at all)

a. There is improved communication and interaction this year among after-school and school-day staff? 1 / 1 / 2 / 1
b. There is improved collaboration and problem solving this year between after-school and school-day staff? 4 / 2
c. School-day staff are more aware this year of the services and resources offered by the after-school program? 1 / 2 / 2
d. The entire school community feels an increased sense of safety this year because of the program? 1 / 2 / 2
e. The school invests more resources in the after-school program this year? 1 / 1 / 2
Observations and interpretations of findings from the surveys of school principals: (The reader is again cautioned to accept these results with circumspection, since they are based on a small number of cases.) The following are observations and interpretations of the above survey responses:

One survey respondent was almost completely negative in all of his/her responses. This was especially noticeable in answers to nearly all parts of question 1, all parts of question 2, and all parts of question 10. This appears to be an across-the-board disgruntled individual.

Question 1 is generally about the program's implementation. With the exception of the across-the-board disgruntled respondent, the responses were generally rather positive.

In the responses to questions 2 and 6, which concern program outcomes, the responses were (with the noted exception) generally positive.

A majority of respondents either perceived improvement over last year's program or saw no need for improvement (see responses to question 10).

Few saw drawbacks for the school in hosting the BAC after-school learning center (question 7).

In general, many more respondents cited strengths of the program than reported matters in need of attention (compare the responses to questions 8 and 9). The program aspects cited as being in need of attention were quite varied, with no single matter standing out.

In addition to the above survey data gathered from principals, surveys of students were also conducted. In total, responses were obtained from 1,745 students enrolled in the BAC ACE program at all nine (9) of the campuses participating in the BAC Cycle 6, Year 4, program.
The number of completed student surveys by individual campus learning center (school) is shown below:

School: Completed surveys
KI: 178
WUN: 169
EILAND: 158
EPPS: 200
GWF: 160
KAI: 296
KLENK: 208
MCDOUGLE: 175
NITSCH: 201

The table below shows the results, by survey question, for all participating students.

Table 11: Results of student participant surveys

1. Favorite enrichment activity:
Health & Fitness: 857 (49%); Fine Arts: 329 (19%); Strategic Learning: 111 (6%); Cooking & Nutrition: 287 (16%); Travel Writers: 161 (9%)

2. Favorite academic activity:
Math Club: 163 (9%); Reading Club: 174 (10%); S.T.E.M.: 937 (54%); Homework: 143 (8%); Technology: 328 (19%)

3. Feelings about afterschool:
Excited: 592 (34%); Happy: 810 (46%); Bored: 192 (11%); Unhappy: 151 (9%)

4. Activities least liked:
Health & Fitness: 89 (5%); Fine Arts: 82 (5%); Strategic Learning: 198 (11%); Cooking & Nutrition: 103 (6%); Travel Writers: 89 (5%)
Math Club: 210 (12%); Reading Club: 196 (11%); S.T.E.M.: 79 (5%); Homework: 654 (37%); Technology: 45 (3%)

5. Do you feel safe?
Yes: 1,588 (91%); No: 157 (9%)

6. What would you like to see done differently? (pick 2)
More Snacks: 989 (57%); More Sports: 423 (24%); More Arts & Crafts: 135 (8%); Longer Class Time: 90 (5%)
More Homework Help: 63 (4%); Less Sports: 0 (0%); Less Arts & Crafts: 45 (3%); More Books: 0 (0%)

7. Do you need help with homework every day?
Yes: 1,502 (86%); No: 243 (14%)

8. Do you play sports when not in the after-school program or at school?
Yes: 959 (55%); No: 785 (45%)

9. Do you have art activities when not in the after-school program or at school?
Yes: 799 (46%); No: 946 (54%)

10. Do you read books when not in the after-school program or at school?
Yes: 837 (48%); No: 908 (52%)

11. Do you practice math or science when not in the after-school program or at school?
Yes: 943 (54%); No: 802 (46%)

Observations and interpretations of student survey results. The following are the principal observations and interpretations of the students' survey results:
Health and fitness was the most popular enrichment activity, while "strategic learning" was the least popular.
S.T.E.M. (science, technology, engineering, and math) was selected as the most popular academic activity, while "homework" ranked at the bottom. Homework was also chosen as the "least liked" activity overall.
Only one in five after-school participants reported being either unhappy with or bored by the program.
Fully 91% reported feeling "safe" in the program.
"More snacks" was the number one recommendation for things to be done differently.
When not participating in the after-school program or at school, students tended to report spending their time either playing sports or practicing math or science.

Besides principals and students, surveys of parents were also conducted. Completed survey responses were obtained from a total of 281 parents whose children were enrolled in the BAC Cycle 6, Year 4, program. Survey results are shown immediately below.

Table 12: Survey responses from parents of participants

1. Do you feel your child(ren) are safe in the program?
Yes: 281 (100%); No: 0 (0%)

2. Do you feel the program provides your children with new opportunities?
Yes: 235 (84%); No: 46 (16%)

3. Do you see improvements in your child's behavior as a result of the program?
Yes: 197 (70%); No: 40 (14%); N/A: 44 (16%)

4. Do you see improvements in your child's grades as a result of the program?
Yes: 137 (49%); No: 46 (16%); N/A: 98 (35%)

5. Do you see improvements in your child's homework completion as a result of the program?
Yes: 168 (60%); No: 31 (11%); N/A: 82 (29%)

6. Do you think the pick-up process is effective and safe?
Yes: 266 (95%); No: 15 (5%)

Observations and interpretations of parent survey results. The following are the principal observations and interpretations of the parents' survey results:
In general (with possibly a single exception), parents gave quite positive answers to each of the survey questions, indicating considerable satisfaction with the program.
The one possible exception concerned seeing improvements in grades as a result of program participation. This exception, however, was most likely attributable to parents not having seen their students' grades at the time they completed the survey; some 35% of responding parents gave no answer (N/A) to this question.
All of the parents surveyed (100%) reported feeling that their child(ren) were safe in the program, and 95% felt that the pick-up process was effective and safe.
Some 84% of parents expressed the view that the ACE program provided their children with new opportunities. Examples cited included learning to read and write, greater interest in learning, art, dance, improved social skills, college preparation, enhanced sports skills, and learning responsibility.
About 70% of parents reported observing improvements in their child's behavior as a result of participation in the program. Finally, about 60% reported observing improvements in homework completion.

VI. Analysis

In this report section an analysis of the program's outcomes is presented. Following an initial caveat, the principal program outcomes are summarized. The program's successes are then described, after which the areas where the program fell short of expectations are discussed.

An important caveat.
To establish that a program has "caused" or resulted in an outcome, three conditions are necessary. First, the program must precede the expected or hypothesized outcome in time. Second, the program must be associated with the expected outcome. Third and finally, the program and the outcome must not be spuriously related. The classic case of a spurious relationship is that ice cream eating and murders in New York City are associated with each other. Of course, it is not that ice cream eating causes murders; both are related to and result from an antecedent condition, the heat of the summer. It is worth noting further that spuriousness can lead one to conclude falsely either that a program is having an outcome effect or that it is not.
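The spuriousness problem described above can be illustrated with a small simulation. All numbers below are synthetic and purely illustrative; they are not program data. Two variables driven by a common antecedent appear strongly correlated, but the association largely disappears once the antecedent is held roughly constant.

```python
import random

random.seed(0)

# Hypothetical daily data: summer heat drives both ice cream sales and,
# in this toy model, incident counts; neither causes the other.
heat = [random.gauss(75, 15) for _ in range(1000)]
ice_cream = [2.0 * h + random.gauss(0, 10) for h in heat]
incidents = [0.5 * h + random.gauss(0, 5) for h in heat]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# The raw association between the two "effects" looks strong...
r_all = corr(ice_cream, incidents)

# ...but holding the antecedent roughly constant (only days near
# 75 degrees), the association largely vanishes: it was spurious.
band = [(i, c) for h, i, c in zip(heat, ice_cream, incidents) if 70 <= h <= 80]
r_band = corr([i for i, _ in band], [c for _, c in band])

print(round(r_all, 2), round(r_band, 2))
```

This is exactly why the antecedent variables discussed below (for example, parental desires) must be measured and controlled for before an observed association is attributed to the program.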
The Waits Consulting Group evaluation team found that these three conditions were not easily established, because the data called for and available through the Texas 21st Century Community Learning Centers program databases were not adequate to the task.
First, more and better pre-test measures need to be incorporated into the Texas 21st databases. Absent such measures, the conditions of time ordering (condition 1 above) and of an association between the program (or participation in the program) and expected outcomes (condition 2) were at best difficult to establish.
Second, many more antecedent variables or conditions need to be incorporated into the existing databases in order to avoid possibly spurious results (condition 3). For example, parents' desire for their children to exhibit better behavior in school is one such antecedent condition. In at least one out-of-school-time evaluation, Durand (2013) found that such desires, as documented by pre-enrollment surveys, were a source of spuriousness: the parental desires led parents to enroll their children in the program and also led their children to exhibit better behavior. The effect was thus not an after-school-time program effect; it was a "parental desires" effect.
Third, better, randomized "control" or comparison group data need to be incorporated into the Texas 21st Century databases. Spurious program outcomes can arise from many sources, including selection biases, history, maturation, testing, and the like. Such multiple possible sources of spuriousness are best dealt with through the "gold standard" of evaluation design: a pretest-posttest control-group design, especially one in which control subjects have been randomly chosen. (For reasons that will not be further discussed here, a simple comparison of program participants and non-participants will generally not suffice to effectively eliminate spuriousness.
See Blalock, H.M., Causal Inference in Non-experimental Research, Chapel Hill: University of North Carolina Press, 1964; and Holmes, W.M., Using Propensity Scores in Quasi-Experimental Designs, Thousand Oaks, CA: Sage, 2013.)
Finally, the three conditions for causal inference aside, there is yet another problem in establishing program outcomes with the existing Texas 21st Century databases: the problem of outcome measurement itself. It is considerably more difficult to measure some outcomes than others. For example, accurately measuring changes in students' behavior is more difficult than measuring changes in, say, school-day attendance. More difficult-to-measure changes and constructs require more indicators for reliable and valid measurement, allowing for "measurement triangulation," the use of multiple indicators of the same construct, the long-recognized solution to reducing random and systematic measurement error (see Siegel, P.M. and Hodge, R.W. (1968). A causal approach to measurement error. In Blalock, H.M. and Blalock, A.B. (eds.), Methodology in Social Research. New York: McGraw-Hill). The importance of reducing random and systematic measurement errors in program evaluation cannot be overstressed. Coleman (Coleman, J.S., The mathematical study of change, in ibid.) has demonstrated how such measurement errors, especially random ones, can result in grossly misleading inferences and conclusions regarding changes in dependent variables, including outcomes. In brief, the Texas 21st Century databases need to include more multiple indicators, especially of outcomes that likely entail greater measurement error, indicators that go well beyond the existing Federal performance measures.
Given these needs for better evidence from the Texas 21st Century databases, and given the conditions described immediately above, the reader of this report is strongly urged to draw inferences about program outcomes with considerable circumspection.

a. "How well did you do?"

In the table immediately below, a summary evaluation of the program's outcomes is presented. The summary is derived from the evidence discussed above in Section V ("Outcomes") of this report and is thus based upon the findings of the Waits Consulting Group evaluation team.

Table 13: Summary of outcomes evaluation findings

Objective: Improved academic performance
Outcome: Charts 1 through 4 above on grade changes, together with the results of the parent surveys, support a conclusion of improved academic performance.
Evaluation: The evidence is consistent with a conclusion that this objective was met.

Objective: Improved attendance
Outcome: The data in Table 9 show improved school-day attendance. Surveys of principals also supported improved attendance.
Evaluation: The evidence is consistent with a conclusion that this objective was met.

Objective: Improved behavior
Outcome: Results from the parent surveys, and indirectly those from the principal surveys, suggest improved behavior.
Evaluation: The evidence is consistent with a conclusion that this objective was met.

Objective: Improved promotion rates
Outcome: Promotion data were not reported and hence were not available. Reports from BAC in Table 10 suggest that the target was not met.
Evaluation: The available evidence indicates that this objective was not met.

For comparison and discussion purposes, the "objective ratings" self-reports of the BAC program staff are presented immediately below.

Objective: Improved academic performance
Strategy for meeting objective: 90% or more of participating students will develop a personal development plan (PDP) that is reviewed and maintained by students and program staff. The PDP links the school day to afterschool and will result in increased academic performance.
Rating: Met the stated objective.

Objective: Improved attendance
Strategy for meeting objective: Adult family members of at least 50% of all participating students will participate in literacy, educational and/or college readiness, workforce development, or financial education activities. This will lead to support and increased attendance.
Rating: Did not meet, but progressed toward, the stated objective.

Objective: Improved behavior
Strategy for meeting objective: 10% or fewer of the total number of students enrolled at each campus will be cited for criminal or non-criminal activities. This will be accomplished through a focus on character education classes and team-building opportunities.
Rating: Did not meet, but progressed toward, the stated objective.

Objective: Improved promotion rates
Strategy for meeting objective: 90% of student participants will participate in college awareness, career exploration, and science/technology activities. This will show students their potential for the future, leading to increased dedication toward graduation.
Rating: Met the stated objective.

Objective: Improved graduation rates
Strategy for meeting objective: 90% of the total number of students enrolled at each campus will receive a grade promotion by spring 2011. This will be accomplished by linking to the school day through PDP development and regular reviews of campus plans.
Rating: Did not meet, but progressed toward, the stated objective.

It is difficult to compare the evaluations of the Waits Consulting Group evaluation team with the "objective ratings" self-reports of BAC. Note that the BAC self-reports were judged against targets based on strategies and, often, on outputs or process indicators, whereas the Waits Consulting Group evaluation was based only on outcome indicators.
Furthermore, the Waits evaluation team did not consider graduation rates, since no high schools were included in BAC Cycle 6, Year 4. Finally, no data were available to the Waits team regarding criminal or non-criminal disciplinary actions. Given the above considerations, it is still interesting to note that the Waits team and BAC agreed that the "academic improvement" objective was met. On the other hand, the Waits team found that the objectives of improved attendance and behavior were met, while BAC officials did not. Conversely, the Waits team concluded that the objective of improved promotion rates was not met, while BAC concluded the opposite.

b. Discussion of program outcome successes

As discussed in the preceding section, the program was successful in improving academic performance and, in the judgment of the Waits team, also successful in meeting the objectives of improved attendance and behavior.
Another noteworthy success was the many positives that parents of participants saw in the program itself, as well as in the improved behavior of their child(ren).

c. Reasons for program successes

In the professional judgment of the Waits Consulting Group evaluation team, the program's successes resulted principally from the following:
1. A strong dedication and commitment on the part of BAC officials and BAC ACE program staff.
2. An effective strategy and set of activities based closely on the program's thorough and comprehensive needs assessment.
3. A close integration of the regular school day with the after-school and summer ACE program.
4. The program's embrace of the "Five E" model of Engage, Explore, Explain, Elaborate, and Evaluate.
5. A close collaboration between the program's evaluation team from the Waits Consulting Group and BAC officials; this collaboration allowed for multiple forms of formative information sharing.

d. Results falling short of expectations

While the Waits Consulting Group evaluation team finds itself in some disagreement with the BAC self-reports, it found no evidence that promotion rates improved as a result of the program.

e. Reasons for falling short of expectations

A lack of evidence concerning promotion rates was part of the difficulty confronting the Waits Consulting Group evaluation team. In the future, promotion rates need to be investigated with more data and through a more rigorous analysis, ideally drawing on improved Texas 21st databases. Another reason why improvements in promotion rates were not greater, and expectations were not fully met, quite likely involves the social circumstances and backgrounds of the students enrolled in the program. The low socioeconomic status and high at-risk status (see Table 1 of this report) of participating students were extremely difficult barriers to overcome.
Despite the factors cited above as reasons for program success, the challenges to success deriving from poverty and low incomes were palpable.
VII. Next Steps

In this report section, program and other recommended changes are offered by the evaluator. These recommendations are intended to help improve the program in the future and to enhance the usefulness of future out-of-school-time program evaluations. For ease and clarity of presentation, the specific finding from this evaluation that gave rise to each recommendation is also included.

Finding 1: To make annual evaluations more useful for enhancing the out-of-school-time program, the outcomes evaluation portion of this project needs to be improved. Improvements to the Texas 21st Century databases are especially needed.

Recommendation 1: The outcomes evaluation design of this project would benefit considerably from incorporating more pretest measures, a comparison or control group, and more antecedent variables into the data collection and analysis. Incorporating more pretest measures would help to eliminate possible selection biases that may over- or underestimate the true impact of the ACE program. A comparison or control group would also help to eliminate such selection biases, as well as other internal validity problems arising from likely error sources such as history, testing, and maturation. Furthermore, including more antecedent variables in the data collection and analysis would help to rule out the possibility that reported program outcomes are spurious owing to prior variables simultaneously influencing both out-of-school-time participation and project results. To that end, the project director should meet early on with the Waits Consulting Group evaluation team to work out the specifics of this recommendation.

Finding 2: Parent involvement in the program was rather low, especially relative to parents' program enrollment across the centers.

Recommendation 2: The program needs to elicit more active participation on the part of parents.
Improved communication with parents, stressing the importance of attending program activities, seems vital. Additionally, conducting a "best time to attend" survey of parents would likely help to identify and circumvent scheduling difficulties.

Finding 3: It is more difficult to measure improvement in students' behavior than in the other ACE objectives.

Recommendation 3: More difficult constructs, such as students' behavior and changes in students' behavior over time, require more indicators for reliable and valid measurement. Accordingly, indicators that go beyond the required state and Federal performance measures (especially discipline referrals) need to be adopted. In particular, in future funding cycles such additional measures as student conduct grades, teacher survey measures, and parent survey questions should be considered and required of after-school programs. Such additional measures will facilitate more careful statistical analysis and investigation of program outcomes regarding students' behavior. This program (BAC) utilized parent surveys about
behavior and is encouraged to continue them. Additional questions about changes in behavior should be included in future parent surveys.

Finding 4: The program's objective of improved promotion rates was not supported by the evidence.

Recommendation 4: A lack of evidence was part of the difficulty confronting the Waits Consulting Group evaluation team in assessing the program's success regarding students' promotion rates. In the future, promotion rates need to be investigated with more data. Particularly needed are more and better data in the Texas 21st databases, data that will allow for a more rigorous analysis with more sophisticated statistical models.
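Recommendations 1 and 4 call for pretest measures, a comparison group, and more rigorous analysis. A minimal sketch of the kind of pretest-posttest comparison-group estimate this would enable is shown below. All scores are hypothetical and for illustration only; they are not program data.

```python
# Hypothetical pre-test and post-test scores for a participant group and
# a comparison group (illustrative numbers only, not program data).
participants = {"pre": [72.0, 68.0, 75.0, 70.0, 66.0],
                "post": [78.0, 73.0, 79.0, 74.0, 70.0]}
comparison = {"pre": [71.0, 69.0, 74.0, 70.0, 67.0],
              "post": [73.0, 70.0, 75.0, 71.0, 68.0]}

def mean(xs):
    return sum(xs) / len(xs)

# Gain for each group: post-test mean minus pre-test mean.
gain_p = mean(participants["post"]) - mean(participants["pre"])
gain_c = mean(comparison["post"]) - mean(comparison["pre"])

# Difference-in-differences estimate: how much more the participants
# gained than the comparison group. Netting out the comparison-group
# gain helps control for history, maturation, and testing effects
# that influence both groups alike.
effect = gain_p - gain_c
print(gain_p, gain_c, effect)
```

With randomly assigned comparison subjects, as Recommendation 1 envisions, this simple contrast becomes a defensible estimate of program impact; antecedent variables can then be added as covariates in more sophisticated statistical models.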
Appendix 1: Enrollment, Days Scheduled, and Average Daily Attendance by Center and Activity
(Each activity row lists: Students Enrolled / Adults Enrolled / Days Scheduled / Days Attended / Student ADA / Adult ADA)

Center - Klein Intermediate
Academic Assistance: 103 / 0 / 69 / 63 / 5 / 0
Adventures thru Academics: 103 / 0 / 28 / 25 / 11 / 0
Homework Help: 103 / 0 / 69 / 64 / 53 / 0
Parent University: 0 / 16 / 6 / 7 / 0 / 10
S.T.E.M: 103 / 0 / 41 / 39 / 11 / 0
Sports (1): 103 / 0 / 41 / 39 / 5 / 0
Sports (2): 103 / 0 / 41 / 40 / 11 / 0
Step (1): 38 / 0 / 28 / 25 / 4 / 0

Center - Wunderlich Intermediate
Academic Assistance: 134 / 0 / 69 / 63 / 60 / 0
Excel at Sports: 53 / 0 / 69 / 63 / 23 / 0
Motions of Dance: 9 / 0 / 20 / 15 / 5 / 0
Parent Night: 0 / 19 / 3 / 8 / 0 / 5
Praise Through Dance: 6 / 0 / 5 / 4 / 4 / 0
Soccer Club: 67 / 0 / 41 / 37 / 32 / 0

Center - Eiland ES
AM HWassist: 39 / 0 / 69 / 63 / 17 / 0
Arts Exploration: 118 / 0 / 69 / 63 / 11 / 0
Excel at Sports: 117 / 0 / 69 / 63 / 28 / 0
Good Eats: 117 / 0 / 36 / 33 / 15 / 0
Heros: 117 / 0 / 60 / 54 / 9 / 0
HWassist: 119 / 0 / 69 / 63 / 84 / 0
Motions of Dance: 119 / 0 / 69 / 63 / 16 / 0
Parent University: 0 / 51 / 1 / 7 / 0 / 12
S.T.E.M Is In: 117 / 0 / 69 / 63 / 12 / 0
Strategic Minds: 41 / 0 / 13 / 12 / 15 / 0

Center - Epps Island ES
BOLT: 197 / 0 / 69 / 63 / 41 / 0
Creations: 197 / 0 / 69 / 63 / 32 / 0
Homework Assistance: 197 / 0 / 69 / 63 / 105 / 0
Leer Creer (Read to believe): 197 / 0 / 45 / 39 / 8 / 0
Parent University: 0 / 21 / 3 / 7 / 0 / 8
Rise and Shine: 37 / 0 / 69 / 63 / 11 / 0
S.T.E.M.: 197 / 0 / 69 / 63 / 18 / 0

Center - Greenwood Forest ES
123s x ABCs: 26 / 0 / 69 / 63 / 15 / 0
Arts Exploration: 130 / 0 / 27 / 24 / 15 / 0
Character Education: 122 / 0 / 24 / 24 / 21 / 0
Excel at Sports: 120 / 0 / 24 / 24 / 43 / 0
Excel at Sports V: 130 / 0 / 45 / 39 / 45 / 0
Homework Assistance: 120 / 0 / 24 / 24 / 86 / 0
Homework Zone: 117 / 0 / 45 / 39 / 89 / 0
Motions of Dance: 130 / 0 / 45 / 39 / 14 / 0
Parent Program: 129 / 35 / 6 / 7 / 0 / 7
S.T.E.M.: 120 / 0 / 24 / 24 / 23 / 0
TechKnow Kids: 130 / 0 / 45 / 39 / 20 / 0

Center - Kaiser ES
Arts Exploration: 108 / 0 / 69 / 63 / 10 / 0
Chapter Chat: 108 / 0 / 69 / 63 / 10 / 0
Excel at Sports: 108 / 0 / 69 / 63 / 10 / 0
HWassist: 108 / 0 / 69 / 63 / 71 / 0
Motions of Dance: 108 / 0 / 69 / 63 / 8 / 0
Parent University: 0 / 18 / 2 / 9 / 0 / 5
Rhymes 'n' Times: 108 / 0 / 69 / 63 / 53 / 0
S.T.E.M Is In: 108 / 0 / 69 / 63 / 8 / 0
Strategic Learning: 108 / 0 / 69 / 63 / 9 / 0

Center - Klenk ES
Academic Aide: 96 / 0 / 69 / 63 / 61 / 0
Arts Exploration: 88 / 0 / 4 / 4 / 10 / 0
Excel at Sports: 96 / 0 / 69 / 63 / 28 / 0
Let's Cook!: 96 / 0 / 69 / 63 / 27 / 0
Parent University: 0 / 37 / 4 / 7 / 0 / 31
Special Projects: 96 / 0 / 69 / 63 / 26 / 0
Strategic Learning: 96 / 0 / 69 / 63 / 27 / 0

Center - McDougle ES
AM HW assist: 24 / 0 / 69 / 64 / 15 / 0
Arts Exploration: 161 / 0 / 55 / 52 / 15 / 0
Decipher Club: 161 / 0 / 13 / 19 / 16 / 0
Excel at Sports: 161 / 0 / 56 / 58 / 28 / 0
Geek Squad Club: 161 / 0 / 56 / 56 / 21 / 0
Heros: 161 / 0 / 13 / 24 / 25 / 0
Parent University: 0 / 44 / 3 / 7 / 0 / 8
S.T.E.M: 161 / 0 / 13 / 17 / 8 / 0
Strategic Learning: 161 / 0 / 14 / 21 / 23 / 0
Study Lounge: 161 / 0 / 69 / 63 / 131 / 0

Center - Nitsch ES
Arts Exploration: 107 / 0 / 41 / 38 / 12 / 0
Computer Technology: 107 / 0 / 69 / 63 / 17 / 0
Culinary (Restaurant Management): 107 / 0 / 28 / 25 / 5 / 0
Homework Assistance: 107 / 0 / 69 / 63 / 73 / 0
Parent University: 21 / 19 / 3 / 8 / 0 / 7
Sports 1: 107 / 0 / 69 / 63 / 18 / 0
Strategic Learning: 107 / 0 / 41 / 38 / 13 / 0

Appendix 2: Changes in Reading Grades and Math Grades during the Cycle on the Part of Participants

Klein Intermediate - Year End Grades
Reading (92 participants): Increase 12 (13%); No Change 56 (61%); Decrease 22 (22%); No Change Necessary 2 (2%)
Math (92 participants): Increase 15 (16%); No Change 55 (60%); Decrease 20 (22%); No Change Necessary 2 (2%)

Wunderlich Intermediate - Year End Grades
Reading (120 participants): Increase 11 (9%); No Change 73 (61%); Decrease 14 (12%); No Change Necessary 22 (18%)
Math (120 participants): Increase 8 (7%); No Change 61 (51%); Decrease 28 (23%); No Change Necessary 23 (19%)

Eiland ES - Year End Grades
Reading (128 participants): Increase 11 (9%); No Change 77 (60%); Decrease 10 (8%); No Change Necessary 30 (23%)
Math (128 participants): Increase 21 (16%); No Change 63 (49%); Decrease 14 (11%); No Change Necessary 30 (23%)

Epps Island ES - Year End Grades
Reading (189 participants): Increase 21 (11%); No Change 101 (53%); Decrease 16 (8%); No Change Necessary 51 (27%)
Math (189 participants): Increase 43 (23%); No Change 96 (51%); Decrease 13 (7%); No Change Necessary 37 (20%)

Greenwood Forest ES - Year End Grades
Reading (129 participants): Increase 19 (15%); No Change 49 (38%); Decrease 35 (27%); No Change Necessary 26 (20%)
Math (129 participants): Increase 10 (8%); No Change 44 (34%); Decrease 48 (37%); No Change Necessary 27 (21%)

Kaiser ES - Year End Grades
Reading (79 participants): Increase 14 (18%); No Change 53 (67%); Decrease 7 (9%); No Change Necessary 5 (6%)
Math (79 participants): Increase 9 (11%); No Change 47 (59%); Decrease 14 (18%); No Change Necessary 9 (11%)

Klenk ES - Year End Grades
Reading (93 participants): Increase 5 (5%); No Change 45 (48%); Decrease 36 (39%); No Change Necessary 7 (8%)
Math (93 participants): Increase 8 (9%); No Change 44 (47%); Decrease 24 (26%); No Change Necessary 17 (18%)

McDougle ES - Year End Grades
Reading (140 participants): Increase 14 (10%); No Change 83 (59%); Decrease 13 (9%); No Change Necessary 30 (21%)
Math (140 participants): Increase 15 (11%); No Change 81 (58%); Decrease 14 (10%); No Change Necessary 30 (21%)

Nitsch ES - Year End Grades
Reading (80 participants): Increase 7 (9%); No Change 45 (56%); Decrease 13 (16%); No Change Necessary 15 (19%)
Math (80 participants): Increase 23 (29%); No Change 31 (39%); Decrease 12 (15%); No Change Necessary 14 (18%)
