3. Background
Title I, K-12 School District in Westfield, Indiana
Mission Statement: To provide meaningful and
engaging work in the pursuit of profound learning
Vision: To be the world-class learning organization
focused on continuous quality growth for all
4. Background on Moodle
Moodle is our learning management system. It is developed from a
learning-centric perspective. It enables teachers to easily set up
interactive, online class communities and materials for their classes, so
students can learn not only within the classroom walls but outside
them as well, all within a secure online environment.
Moodle was instituted at Westfield-Washington Schools in the 2008-
2009 school year. Not only is Moodle used in K-12 schools, but it is also
used at the college/university level as well as in business and industry.
By implementing this learning tool in our district, we are preparing
Westfield-Washington Schools’ students for the real world.
6. 1b. How does the project support our goals?
Goal: Reduce summer learning loss in all areas for students in grades K-8
Goal: Reduce gap on NWEA RIT scores from Spring to Fall
Goal: Provide cost effective, valid and engaging summer learning for Westfield families
Strategy: Create free and interactive summer learning vehicle with Moodle
Strategy: Provide feedback and collect data throughout the summer for participating students
Initiative: Use core standards and district technology experts to create Summer R.O.C.K.S. program
Initiative: Provide monitoring and feedback to students within Moodle using Westfield teacher experts
Output (Measure): Compare NWEA RIT scores from Spring to Fall after completion of the Summer R.O.C.K.S. Moodle program
7. 1c. How were key stakeholders
identified?
Key Stakeholders Why chosen
Students -Need to retain essential standards
-Need to pass ISTEP+
-Need to reach NWEA RIT growth goals
-Need 21st Century learning exposure
Monitoring Teachers -Provide feedback to improve/change programming
-Continuously update content to meet needs of participating
students
-Collect and analyze data
Next-In-Line Teachers -Use data to guide classroom instruction
-Compare previous year’s data to show summer learning loss gap
closing
Administrators -Need to increase student performance
-Raise ISTEP+ passing rates
-Raise NWEA RIT scores to meet growth targets
8. 1d. How are the key stakeholders
impacted?
Key Stakeholder How are these stakeholders impacted?
Students -Improved academic performance
-Better prepared for upcoming school year
-Close the summer learning gap
-Improved technology skill set
Monitoring Teachers -Validation from feedback and level of
participation by students
-Use differentiated instruction and increase
knowledge of best practices
-Increased awareness of creating Moodle activities
Next-In-Line Teachers -Less time spent reviewing previous year’s content
-Increased awareness of incoming students’ academic skills
Administrators -Increased student performance
-Raised ISTEP+ and NWEA RIT scores
-District-wide consistency
9. 2a. How was the root cause
identified?
Student takes Spring NWEA on grade level skills → Summer Break → Student takes Fall NWEA to build on previous skills
Problem Statement: Due to summer regression, teacher spends weeks reviewing previously taught material.
11. 2a. How was the root cause
identified?
Potential Root Cause:
The lack of student learning
opportunities results in
academic regression during
the summer.
12. 2b. How did the team use data and
information?
Used soft data such as teacher observations on how
students were performing on previously learned standards
Used NWEA test scores, comparing each student's previous spring
score to the current fall score to indicate learning loss (see the illustrative sketch below)
Used district CIC (Common Instructional
Checks) to compare retention of previous grade level skills
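As an illustration only (not part of the original analysis), here is a minimal Python sketch of that spring-to-fall comparison, assuming NWEA results exported to a CSV with hypothetical columns student_id, term, and rit_score:

```python
# Illustrative sketch only: the file name, column names, and layout below are
# assumptions for demonstration, not the district's actual NWEA export format.
import pandas as pd

scores = pd.read_csv("nwea_scores.csv")  # assumed columns: student_id, term, rit_score

spring = scores[scores["term"] == "Spring"].set_index("student_id")["rit_score"]
fall = scores[scores["term"] == "Fall"].set_index("student_id")["rit_score"]

# A positive value means the fall score came back lower than the spring score,
# i.e. summer learning loss for that student.
loss = (spring - fall).dropna()

print(f"Students compared: {len(loss)}")
print(f"Mean spring-to-fall RIT loss: {loss.mean():.1f}")
print(f"Share of students regressing: {(loss > 0).mean():.0%}")
```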
13. 2c. What key data was identified in
team’s analysis?
14. 2d. How did the team ensure it
identified the actual root cause?
Root Cause:
Students regress over the summer due to
lack of learning opportunities
Validated by:
Analysis of soft and hard data of a
sample class following their progress
over two years
15. 3a. How was the final solution selected?
Our Corporation recruited a team of teachers from each grade
level, IT experts and administrators to evaluate and enhance the
current summer program.
Team Discussion: Where have we been, what results do we want, and how do we obtain those results?
Answer: Start the process by
reviewing past initiatives with
an eye on the future.
16. 3a. How was the final solution selected?
Mission: Stopping the “Summer Slide”
Year One - Paper Packets or Purchased Workbooks
Year Two - Basic Moodle
Year Three - Enhanced Moodle
Year Four and Beyond - Where do we want to go?
17. 3a.- How was the final solution selected?
More of What We Want:
•Deep knowledge of core standards
•High level of participation among students per grade/activities per grade
•Summer-long participation
•Learning assessments
•Continuous improvement of program
•Cost-effective and accessible program
Less of What We Do Not Want:
•Learning loss from summer regression
•Students starting the new school year behind on academic skills
•Significant cost or demands on families
18. 3a. What are potential solutions?
Year One 2008 and before Reflection
Where We Were: Teachers copied random activity worksheets for
students to complete over the summer and/or schools had vendor
supplied workbooks for purchase.
Key Points:
*Easily created by teachers
*No accountability for students or teachers
*Copying costs to make packets; students often could not locate their
packets when school started in the fall
*Difficult for teachers to grade
*Not necessarily directly linked to standards
*No lead team in charge for grade level or schools
Team Discussion Results: Packets do not provide enough reliability or
engagement for students; data was difficult to obtain; no reliable
numbers on student participation
19. 3a. What are potential solutions?
Year Two 2009 Reflection
Where We Were: Basic Moodle was introduced in our
district; select teachers decided to use this online format
Key Points:
*Available to 5th and 6th grade students
*Only quizzes were provided
*No monitoring or reporting
*No learning enrichment (no new content)
*Minimal cost; teachers created quizzes and students accessed the site
on their own or through library Internet access
*Interface not familiar to students
*Single-session quizzes; no reason to return to the site
*Able to collect some data on student participation; about 75 students
Team Discussion Results: Not enough interaction with other
students or teachers, data was limited, students did not return to the
site throughout the summer months
20. 3a. What are potential solutions?
Year Three 2010 Reflection
Where We Were: Enhanced Moodle site with more activities
and monitoring based on the results and feedback we received
from the previous year
Key Points:
* Students were familiar with Moodle interface
* Available to only 4th-6th grade students
* Added content; 30 assignments/activities for reading and math
* Easy to navigate for students
* More engaging activities
*Teacher monitoring
* Reporting/Trending of results
* Increased student participation
Team Discussion Results: Greater student participation, students began
meeting each other before the school year started via Moodle chats and
messaging, usable data for teachers and corporation
21. 3a. What are potential solutions?
Year Four 2011 Identified Major Enhancements
Key Points:
*Expanded Summer R.O.C.K.S. to include K-8 students
*Sought student and parent feedback
*Increased teacher lead team numbers
*Teachers monitoring student participation daily
*Separated activities into months which included additional
activities, games, lessons, videos and end of month
benchmarking quizzes
*Increased student and parent awareness through websites,
flyers and newsletters
22. 3b. How was the final solution selected?
Needs evaluated: Cost Effectiveness, Access, Accountability, Student Interest (each rated Poor, Good, or Excellent)
Summer Learning Loss Programs rated:
Copied Paper Packets
Purchased Workbooks
Basic Online Quizzes
Enhanced Online Activities, Lessons, and Quizzes
25. 3d. Final Solution- What were expected
benefits and how were they determined?
26. 4a. How was the solution’s
implementation planned?
March 2011: A team of teachers was asked to work together and build the updated Summer R.O.C.K.S. program.
March and April 2011: Teachers used state and district standards to create lessons, activities, and quizzes for students.
April and May 2011: Teachers were recruited to monitor and collect data throughout the summer on the Summer R.O.C.K.S. site.
May 2011: Our district informed students, families and community centers about our Summer R.O.C.K.S. program.
May 2011: Students were excited and began using Summer R.O.C.K.S.
June 2, 2011: Summer R.O.C.K.S. began to collect data for 2011.
27. 4b. What was the process for
implementing the solution?
28. 4c. How did the team achieve support for
implementing the solution?
All classrooms were visited by district technology experts to
explain how Summer R.O.C.K.S. worked, how to access it and to
share some of the great results from the past.
Families and the community were provided the information via
newsletters, blogs, Internet sites and flyers.
We have data to support increased participation in our program:
<2008 = Unknown number of participants
2009= 75 participants
2010= 500+ participants
2011= 1,016 as of 6/10/11
29. 4d. How will the results be measured
and sustained?
Record number of students participating
Measure amount of time students are on the Moodle site
Count the number of activities and assignments students completed
Collect data on their quiz results
Compare NWEA test scores from spring to fall to assess student retention of skills
Reassess the program each year for areas of improvement
30. 5a. How were our team members
selected?
Executive Director of Learning Systems- Leader
Grade Level Teachers were chosen for their:
Technology expertise
Commitment to 21st Century learning
Leadership within district
Knowledge of data-driven learning
Central Office Staff chosen for their expertise:
Database Coordinator - technology expertise; knowledge of data retrieval process
Instructional Technology Coordinator - technology expertise; data research and analysis; knowledge of online learning platforms
31. 5b. How were our team members
prepared to participate?
Prior to our project for School Improvement we:
Trained team members on Moodle
Participated in Summer R.O.C.K.S. planning meetings
Created Summer R.O.C.K.S. learning platforms for specific
grade levels
During our project for School Improvement we:
Regularly checked and updated data using real-time information
from student usage of Summer R.O.C.K.S.
Participated in open dialogue with each other about program
success and needs for improvement both in meetings, via video
and through email communication
Consistently compared and shared data as a catalyst for
discussion and change
32. 5c. How did our team members contribute?
Team Members Major Contributions
Maria Esterline •Scheduled and facilitated meetings
•Oversaw project
•Gathered feedback and collected data
•Produced graphs and comparative data
•Supported team efforts and project
Jodi Dubovich •Analyzed data from team members
•Co-created Summer R.O.C.K.S. learning platform for grade 4
•District technology leader
Beth Bielefeld •Compiled data and information on summer learning loss
•Analyzed data from team members
•Created Summer R.O.C.K.S. learning platform for grade 5
•District technology leader
Jennifer McAndrews •Compiled data and information on summer learning loss
•Co-created Summer R.O.C.K.S. learning platform for grade 4
•District technology leader
Aaron Byers •Co-created Summer R.O.C.K.S. learning platform for grade 2
•Expert on Core Learning Standards
•District technology leader
33. 5d. How was our team performance
managed?
Our team created specific norms for effectiveness:
Meeting times- Bi-weekly to weekly,
agreed upon times, video feed for review
Confidentiality- Data and information
vital to success of the project kept within
our team
Expectations- each member had
specific tasks and deadlines by which to
share information
Communication- team members
communicated via email, phone, text and
in person
36. 6a. What were the results?
Key Stakeholders Benefits/Evidence
Students -Enthusiastic return to school/teacher report
-Less summer learning loss/data analysis
-Recovered instructional time/principal observation
-Increased technology skill/teacher observation and parent
comment
Monitoring Teachers -Greater familiarity with standards/teacher report
-Continued student relationships outside of the
classroom/teacher report
-Enhanced skills in web-based learning/ principal observation
Next-In-Line Teachers -Recovered instructional time with less review/teacher report
-Receive “better prepared” students/teacher report
-Opportunity for higher level technology based instruction due
to enhanced student skill/teacher report
Administrators -Opportunity for innovation through greater skilled teachers in
web-based learning/district level discussion
-Better served students/ district level discussion
-Higher student performance/district level discussion
40. 6c. How were the results shared?
Results Shared:
Parents - Brochures, district websites
Teachers - Faculty meetings, classroom visits
Community - District website
District - LDT and School Board monthly meetings
Students - School-wide celebrations, classroom visits
41. Project Summary
The Westfield-Washington School District has developed a
no-fee summer enrichment program for our students.
This program, Summer R.O.C.K.S., fills the void in the
academic calendar and supplies interesting, self-paced and
fun learning opportunities for students in grades K-8.
Our unique Summer R.O.C.K.S. program can be replicated
and customized by other schools in the United States and
internationally.
Each participating team member has expanded his or her
own technical skills and capabilities.
42. Next Steps
Continuous Improvement:
*Assessment of our program effectiveness
through NWEA Fall testing
*Compare NWEA Fall scores against the
program content
*Focus on problem areas where scores were
lower than projected
*Evaluate student feedback on what was liked and not liked
Westfield Washington Schools began their continuous improvement journey in 2005. Mission and vision setting are important processes; world-class learning is defined as follows.
“World-class continuous quality growth” is a vision-setting standard for Westfield schools. Operationally defined, it means that Westfield students will experience uninterrupted growth and improvement, enabling them to compete with students in the most successful American schools, as a proxy measure that WWS students are prepared for their yet-undefined, ever-changing future needs in global post-high-school settings. WWS uses the Northwest Evaluation Association (NWEA), which maintains the nation’s largest Growth Research Database, as its baseline indicator of student growth. Data show that, across the nation, most students fall more than two months behind over the summer (National Summer Learning Coalition), making summer learning loss a hot topic even in suburban, typically high-achieving schools. The graph shows WWS loss in RIT scores in Language Arts, Reading, and Math from spring to fall on NWEA testing. This is a continuing trend that our traditional summer learning methods were clearly not reducing. For the WWS Title I population, it is important to note that those children typically fall even further behind, sometimes two to three months in reading each summer (National Summer Learning Coalition). Key things we needed to keep in mind were accessibility, affordability and, of course, content.
We wanted to assure that the project goals would address our vision for world-class learning and our mission of non-interrupted continuous learning and growth. Our project focused on reducing the NWEA learning loss gap through an engaging yet cost-effective approach that would serve all K-8 students in all content areas.
The four key groups were initially identified through district-level strategic planning discussions, which included root cause discussion within the school improvement process and study of variation across student groups in the unacceptable trends in identified key performance areas: at that time, primarily ISTEP+ and NWEA growth. District administrators identified a need for “students” to be defined as all K-8 students, since trends indicated that a summer intervention capable of engaging all students at all skill levels would have the greatest impact and would align with the district vision of “continuous quality growth for all.” A grades 5-6 pilot showed promise of a model capable of reaching out to all students, an improvement on former summer interventions that served only select students. The pilot also showed initial success in reducing the gap between spring assessment and returning fall assessment through NWEA and an emerging, locally designed assessment (CIC). The Summer Learning Loss “Moodle” team was led through a review of those discussions and experiences by the district-level representative who had been involved in both. Brainstorming by team members targeted students as the most important stakeholders, and both monitoring and “next-in-line” teachers as stakeholders who could both support the efforts and benefit from the results.
Team members, through naturally occurring conversations with colleagues, gathered and shared with each other the wishes of “monitoring” and “next-in-line” teachers. Brainstorming a list of desires for those two groups guided the work in assuring that Summer Moodle would present to students a combined set of experiences representing work of the just-completed year and work expected in the students’ upcoming year. To maintain the team’s focus on desired results, team members thought of parents/families as coinciding with students, since their satisfaction depends on their child’s performance.
Team members, as members of the WWS staff, had experienced many data reviews (sample slide 11) of student performance as students left for summer and as they returned from summer. The building and team level school improvement process, a Plan Do Study Act model, had captured those discussions and identified dissatisfaction with continued spring-to-fall gaps. The 5th-6th grade pilot team, and then the Summer Learning Loss team, reviewed the variety of summer interventions offered across the district before 2008 and found that there were no consistent approaches across buildings; even within buildings the approaches to summer learning loss were random, changing from summer to summer. However, there was consistency in the district view that lack of student learning opportunities resulted in summer learning loss, which in turn led to delayed fall instruction while previously taught learning objectives were reviewed.
This trend data review is typical not only of the review the Summer Learning Loss team conducted but also of the building and team level review of student performance within the district’s PDSA School Improvement Process. The five-year trend at a key transition point in the district, where six K-4 buildings feed into one 5-6 building, showed a continuing gap in the retention of previously learned skills; when studied as subgroups, it showed unacceptably high levels of variation in student performance influenced by the student’s home school. The team’s analysis supported the conclusion that variation in processes, and the limited number of students reached by those processes, were the root cause of the lack of consistent improvement in reducing summer learning loss over time.
The root cause of the summer learning gap is the lack of a program that students can manage at their own pace, from their summer location (home, daycare, vacation, etc.), and that is interesting and fun enough to compete with their other summer activities. To address the root cause, we created the “Summer R.O.C.K.S.” program.
The soft data review consisted of teachers observing student performance and sharing those observations with each other in collaborative discussions guided by trend performance data. Areas of strong performance were linked to interventions, such as a targeted group of students who had an intensive summer learning experience (i.e., on-site intervention student groups), and areas of weak performance were linked to indicators that students may be experiencing little enrichment in the summer (i.e., free/reduced lunch and/or second language demographics). The conclusion was drawn that a lack of guided learning experiences does promote summer learning loss. Team members used benchmarking provided through national norm tables to identify what spring-to-fall performance patterns are typical for WWS and for the nation and to establish desired performance patterns. In addition, the district’s evolving formative assessment model, known as Common Instructional Checks, was used as comparative data to see if NWEA trends were reflected in the retention of Essential Skills as the targeted measure of CICs. The team concluded that both NWEA and CIC fall performance would be impacted by more students engaging more often in summer learning experiences.
This is a sample class following the same students over two summers. Each summer, students in the sample class who did not participate in summer learning programs showed significant NWEA RIT score reductions from spring to fall. Due to the drastic drop in scores (representing increasing levels of summer learning loss when compounded over more than one summer), a new initiative was necessary to maintain the momentum of continuous growth from one year into the next. The team triangulated five years of trend data (slide 5), in which summer regression was a constant, with a specific five-year look at a key point of transition in the district (slide 11) and with the impact of summer learning loss over time on students who do not have summer learning experiences. Comparative data at other grade levels suggested that this was a district trend across K-8 grade levels as well. The triangulation, and the strong consistency of summer regression patterns for students with no guided summer learning experiences, validated the team’s identification of the root cause, shown on our next slide.
Analysis of the trend data, and the evidence that trends in the middle grades (3rd, 4th, and 5th) were representative of unacceptable trends at other grade levels, suggested that summer learning opportunities needed to improve if summer learning loss was to be reduced. The team reviewed interventions that had been tried in the past, including paper/pencil packets, on-site interventions for select students, and grade-specific computer-based experiences such as the early Moodle attempts, as opportunities for enhanced summer outreach if redesigned and scaled up. The team determined that only an approach that was district-wide, consistent, fully scaled for all students, conveniently accessible at home, and capable of data-rich analysis had the potential to most quickly and most successfully stop summer learning loss.
Solution Development: As a district, Westfield-Washington Schools had been working within a five-year strategic plan focused on continuous growth since 2006. In several cycles of improvement, informed through run chart analysis of variance in student achievement data trends, a variety of identified interruptions in the continuous learning process highlighted summer learning loss as a consistent and significant interruption across grades kindergarten through eight. Through a Plan Do Study Act process, summer interventions were examined, and unacceptable variation in methodology and results across the district seemed to impede desired results. The team was selected to assure that teachers most successful in informing practice through data analysis, most talented in technology-driven instruction, most familiar with web-based platforms, and most skilled in developing curricular-aligned student work would be working together. The team was given the challenge of developing a district-standardized summer enrichment program that would: reduce the identified variation in previous summer interventions; reach many more students; and show evidence of reducing summer learning loss as measured by district Common Instructional Checks and NWEA RIT spring-to-fall indicators.
Each year Westfield educators reflected on the summer learning practices offered and made changes according to the data we received, our goal always being to close the summer learning gap and improve retention of previously learned skills.
The district, in its study of continuous growth and its efforts to identify interruptions to continuous learning, began exploration of potential summer outreach solutions. Random summer solutions were offered, and within the school improvement PDSA at building levels, teachers and administrators reviewed student participation rates, evidence of reduced summer learning loss, and teacher satisfaction with the efforts. Teacher teams developed lists of considerations to reference throughout the district’s search for an impactful summer outreach; options explored included paper/pencil packets, individual teacher attempts at web-based outreach, and eventually a Moodle-based pilot. Teachers knew that they wanted a data-driven and effective summer online program focused on student retention of information that could have a profound effect on students through fun, interactive and cost-effective means.
Both teachers and students were now more knowledgeable, due to regular use of Moodle throughout the district for lessons, activities and Common Instructional Check quizzes. Focused discussions using the data collected, along with teacher brainstorming about what was working and what would most likely work, led the team to believe that we were on the right track with a delivery system that could scale to reach “all” students, could provide data for improvement efforts, and would engage our students. Small case studies suggested that the outreach would successfully combat summer learning loss.
The team, through brainstorming and improvement discussions around student participation data, which indicated differences in early and late summer participation as well as in time spent by individual students on activities, and through correlation studies of summer interventions with student performance, determined that by enhancing the site with additional activities, forums and quizzes we would be able to excite students to return to the site throughout the summer and collect the data results we wanted to achieve. Teachers volunteered to create and monitor the site. The team focused on where we want to go and the process of how to obtain our desired results.
Following the team’s analysis of affordability, accessibility, accountability, and student interest, the team brainstormed ideas for how to translate their understanding of the historic offerings, which had been limited in scale, and the online pilots, which had been limited in outreach, into a district-wide offering that would have high impact. Evidence from the 5th/6th grade pilot, and a district survey through the technology department suggesting that 90% of our families had online access from home, guided the team in determining that the district could maximize results by enhancing online opportunities with increased teacher-created content and a month-by-month summer framework for encouraging participation. A direct link between online learning and reduced summer learning loss was validated through the team’s analysis of the performance of the incoming 5th grade pilot group. The analysis included a comparative study of students who participated in Summer R.O.C.K.S. and those who did not. The team determined that Summer R.O.C.K.S., as we now called it, was the best venue to provide summer enrichment to the most students. By having the program online, students would be able to access the site from wherever they happened to be, as long as they had Internet access; this allows students visiting relatives or on vacation to continue their summer learning. Summer 2011 results, currently being analyzed, indicate that the success of the pilot group is being replicated across more grade levels and with more students. NWEA, a nationally normed benchmark, provided independent verification external to the project, showing objective evidence that our Summer R.O.C.K.S. program is having a direct impact on student retention of previously learned material and continuous student growth.
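The comparative study of participants versus non-participants can be illustrated with a short sketch. This is not the district's actual tooling; the file names and columns (spring_rit, fall_rit, and a hypothetical participant roster) are assumptions made for demonstration:

```python
# Illustrative sketch only: file names, column names, and the participation flag
# are assumptions for demonstration, not the district's actual data model.
import pandas as pd

scores = pd.read_csv("nwea_spring_fall.csv")    # assumed: student_id, spring_rit, fall_rit
rocks = pd.read_csv("rocks_participants.csv")   # assumed: student_id of Summer R.O.C.K.S. users

df = scores.copy()
df["participated"] = df["student_id"].isin(rocks["student_id"])
df["rit_change"] = df["fall_rit"] - df["spring_rit"]  # negative values = summer loss

# Compare average spring-to-fall change and the share of students regressing,
# for participants vs. non-participants.
summary = df.groupby("participated")["rit_change"].agg(
    students="count",
    mean_change="mean",
    pct_regressing=lambda s: (s < 0).mean(),
)
print(summary)
```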
For the final solution, the team expected that more students could dramatically reverse summer learning loss. A team goal was that we would see at least a 50% reduction in summer learning loss if a student “actively” participated. A surprise was that many students not only stopped summer learning loss but, as this one case study (not atypical of many other students) showed, actually reversed it, resulting in summer learning gain. These snapshot samples indicated that with increased participation more students could demonstrate this level of learning.
1. March 2011: The Summer R.O.C.K.S. team was challenged to raise the level of student participation at all grade levels by at least 50% and to see a reduction in summer learning loss across the district of 25%.
2. March/April 2011: The challenge was to assure that online student work would align to standards so that assessed learning targets would be reliable and valid. A significant challenge was for team members to challenge each other in their assessment of that alignment. Work sessions, email chats, and continuous review were the methods used in this important task.
3. April/May 2011: The team determined a profile of the teacher who would be best in the roles of stipend-paid summer support, including: technology savvy, passionate about online learning, strong data analysis skills, strong student relationships, and capacity to self-manage. The team brainstormed a list of suggested teachers and district leadership, after review, offered opportunities to select teachers.
4. and 5. May 2011: An important process was for team members and administrators with technology backgrounds to meet classroom by classroom to introduce students to Summer Moodle. Those visits were supported through online outreach, distributed hand-outs, and newsletters to parents. All students left for the summer at least registered.
6. June 2011: The district teacher leader, a member of the Summer R.O.C.K.S. team, worked with district-level technology teams and the district’s data team to create a framework for data collection and analysis. Improvement criteria identified through early Summer Moodle efforts were set: 1) daily collection of large amounts of data, 2) specific data by time of day, activity completed, etc., 3) ability to track interaction of student to student and student to teacher, 4) ability to analyze at the building level, at the grade level, and at the individual student level, and 5) ease of triangulating data with student performance data.
After teachers created survey sheets for the students about what they liked about previous Summer Moodle sites and what they would like to see included, student interests were at the center of collaborative efforts resulting in a 00% increase in available activities. The process included individual teacher searches for links that could be tied directly to standards, sharing with other team/district members for affirmation of student interest and link to standards; organizing the online offerings by grade level, by availability on a monthly basis, and by groups to be monitored by stipend paid staff; and planning the classroom by classroom outreach described previously.
The R.O.C.K.S. team encouraged buy-in with all stakeholders and across the district as a whole through individualized strategies. For the students, after the initial introduction, teachers trusted that, now that most students were aware of the Moodle system, students themselves would share and encourage other students to participate; that proved to be very successful. For the monitoring teachers, recognition was embedded in building and district leadership opportunities by the Executive Director of Learning Systems, who supported the efforts, recognized the teachers for their “above and beyond” work at building-level meetings across the district, at district-level work sessions, and often in public School Board meetings, as well as seeking a stipend for the work. The R.O.C.K.S. team supported one another and the monitoring teachers through shared expertise, shared success celebrations, district staff meetings, building-level faculty meetings, and grade-level team meetings, and district administrators spoke highly about the promising results emerging through the pilot efforts. For the “next-in-line” teachers, demonstrations in faculty meetings introduced the concept of “recovered instruction” if students returned in the fall in need of less review. In order to continue support of “next-in-line” teachers and to develop their enthusiasm for launching the next Summer Moodle experience, the district data team and R.O.C.K.S. team prepared detailed graphing of student performance at the building, grade, classroom, and student level at the end of each summer experience. One of the most powerful strategies was a class roster for the “next-in-line” teacher detailing which of their new students that fall had participated and at what skill level they were entering the classroom. Administrators were informed not only of participation rates but of student achievement throughout the summer by email, by district-led discussions, and through the superintendent’s support of the efforts. It was soon true that buy-in was a given across K-6; strategies are being studied for improving grades 7-8 participation.
We are able to measure our results directly from our Moodle site. The site includes tracking of total participants, number of student log-ins, assignment grading results and quiz results throughout the entire summer break. Students’ test scores from our district’s NWEA testing will confirm our data. We plan to sustain our program by routinely assessing our process and results and making adjustments where necessary. Student participation is counted not only as how many unique student enrollments are captured, but also how many students return to the site and how often students interact with the site across the full summer timeline. This level of detail inspired the improvement of releasing “activities” on a monthly basis across the summer to introduce novelty over the summer months. Students’ length of time online each visit is captured, and the team studies the patterns of behavior to determine the interest and/or challenge level of particular activities, so that the “types” of activities that seem to provide the most challenge and/or interest can be replicated or developed. A comparative study is conducted at the district level at the close of each summer to determine the correlation of levels of participation with student summer learning growth: at what number of completed tasks do we see the most growth? Quiz results are most helpful to “next-in-line” teachers for determining the incoming skill level of their new students; this, of course, adds to the buy-in of those teachers. The spring-to-fall gap is the primary indicator of improvement in student growth levels and has become the primary reason for district and teacher support of this effort. The R.O.C.K.S. team is the process owner for year-to-year improvement, using the data collected, teacher input from across the district, and, most importantly, intentional collection of feedback from participating students.
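As a rough illustration of how such participation metrics might be tallied, here is a minimal sketch assuming the Moodle activity report has been exported to a CSV; the file name and columns (student_id, timestamp, activity) are hypothetical, not Moodle's actual log schema:

```python
# Illustrative sketch only: the export file and its columns (student_id, timestamp,
# activity) are assumptions, not Moodle's actual log schema or report format.
import pandas as pd

log = pd.read_csv("summer_rocks_activity_export.csv", parse_dates=["timestamp"])

# Keep only events that fall inside the summer window being studied.
summer = log[(log["timestamp"] >= "2011-06-01") & (log["timestamp"] <= "2011-08-15")]

per_student = summer.groupby("student_id").agg(
    events=("timestamp", "count"),              # rough proxy for site visits/log-ins
    activities_touched=("activity", "nunique"), # distinct assignments, quizzes, forums
    first_visit=("timestamp", "min"),
    last_visit=("timestamp", "max"),
)
# How many distinct weeks each student came back to the site over the summer.
per_student["weeks_active"] = summer.groupby("student_id")["timestamp"].apply(
    lambda t: t.dt.isocalendar().week.nunique()
)

print(f"Unique participants: {len(per_student)}")
print(per_student[["events", "activities_touched", "weeks_active"]].describe())
```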
The Executive Director of Learning Systems was the primary process owner for developing a summer intervention. Working with district leadership, he gathered input as to what resources would be required for such a vast undertaking. He worked with the district Technology Systems Group to identify technology skill level that would be needed, with building principals to determine the student relationship requirements, and with lead teachers to determine the level of commitment and passion that would be needed to sustain long term efforts. After profiling the type of team member needed, he sought recommendations from building principals and building teacher leaders. He spoke with teachers suggested for the role and invited those whom he felt most suited for the role.
Most training was at the skill level needed for building and monitoring a web-based program. In addition, the Executive Director of Learning Systems, through his work with the district Instructional Technology Coordinator, established protocols for working together. An important district support was training in analyzing student performance data as a correlation study with student participation indicators. The district data team prepared a variety of analyses and met with lead team members to guide interpretation.
The Executive Director of Learning Systems provided oversight for team building, team protocol (open sharing; management of timelines; fact driven decision making, such as student interest surveys for guiding design improvements); and process adherence including reporting out to building supervisors.
The team discussed their individual needs and their individual capacity for commitment to the project. By listing the individual needs and then building a scale of value, each team member had full input into how they would manage the task. Reasonable meeting times were agreed upon; respect for colleagues was an important professional consideration, since comparative data would reveal variations; clear direction was agreed upon; and real-time communication would be key. The team managed through checks and balances, oversight from the district director, and principal support for balancing other responsibilities.
Students who participated in Summer Moodle showed significantly less regression on their NWEA RIT scores in Reading from Spring to Fall than those students who did not participate in the Summer Moodle program.
Student B, who did not participate in Summer Moodle, showed no gains, or a regression in NWEA scores in all areas.
We have created a model that other schools admire: a prototype of educational outreach beyond the school walls that can keep our students engaged at any time of day, in any month of the year, and from any location in the world.
Comparing enrollment and participation over the years, we have already seen significant increases in enrollment and task completion with our Summer R.O.C.K.S. 2011 program.
Results have been shared in many ways. On our district website, visitors can access our district scorecard with NWEA growth for schools. Parent nights were held to inform families about the benefits and gains to be had for students who participate in our Summer Moodle program. Email communication and informational literature are also sent home to inform families of their child’s growth. Teachers are given access to student growth information through varied online resources (Inform, NWEA site, PowerSchool). Monitoring teachers share results of the Summer R.O.C.K.S. program with participants, giving immediate feedback.
Bridging the summer gap is an essential element of learning retention. Through the use of innovative web-based tools, we are engaging young minds over the summer months and are actively monitoring participation at each grade level.
We believe we are on the right track with this year’s enrollment in excess of 1,000 students. We look forward to reporting next year’s standardized testing to confirm the effectiveness of our program. The evaluation of student feedback for 2011 is enhanced by a considerably larger population of students.