This document discusses using data wisely from a superintendent's perspective. It covers three main topics: assessment basics, improving assessment programs, and developing a data culture. The document emphasizes that what is measured gets attended to, so assessments must be properly aligned and designed. It also stresses using multiple years of data to provide context and control for outside factors to fairly evaluate teachers. Developing the right assessment systems and using data thoughtfully can significantly improve student achievement.
Discussion about trends in assessment and accountability for the National Superintendent's Dialogue
Using Assessment Data for Educator and Student Growth (NWEA)
This presentation reviews major topics to be considered when using assessment data in implementing a school's program of educator and student growth and evaluation. By attending this workshop, participants will improve their assessment literacy, learn how to improve student achievement and instructional effectiveness through thoughtful data use, and discuss common issues shared by educators when using data for evaluative purposes.
Overview of assessments, growth, and value added in a teacher evaluation context
E-assessment Conference Scotland 2014 presentation
As technology evolves and becomes more integrated into education, the data trail created by learners is enormous. The analysis of this data, referred to as “learning analytics”, drives learning in a cyclical pattern: data is collected, analysed, and interventions are made based on the data. After these interventions, more data is collected and analysed, and additional (perhaps different) interventions are made.
This presentation outlines how assessment-related data is collected from three different projects within DCU and then analysed with the aim of improving the student learning experience. Each project has two common threads: making life easier for the lecturer and improving the experience of the student.
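The collect, analyse, intervene cycle described above can be sketched as a toy loop. The threshold, score values, and score-boosting "intervention" below are placeholder assumptions for illustration only, not anything from the DCU projects.

```python
# Toy sketch of the learning-analytics cycle: collect -> analyse ->
# intervene -> collect again. The threshold and the "intervention"
# effect are placeholder assumptions, for illustration only.

def analyse(scores, threshold=60):
    """Flag learners whose latest score falls below the threshold."""
    return [name for name, score in scores.items() if score < threshold]

def intervene(scores, flagged, boost=10):
    """Apply a hypothetical intervention to each flagged learner."""
    for name in flagged:
        scores[name] = min(100, scores[name] + boost)

scores = {"ana": 72, "ben": 55, "cara": 48}  # collected assessment data
for cycle in range(2):                        # two passes of the cycle
    flagged = analyse(scores)                 # analyse
    intervene(scores, flagged)                # intervene, then re-collect

print(scores)
```

Note that the second pass flags a different (smaller) set of learners than the first, which is the "additional, perhaps different, interventions" point made above.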
Data and assessment PowerPoint presentation 2015 (Erica Zigelman)
Presented for DATAG in Albany, NY. This presentation covers the multiple types of data you may obtain within your classroom and how to assess your students.
Presentations, morning session, 22 January 2018 HEFCE open event “Using data to...” (Bart Rienties)
With the Teaching Excellence Framework being implemented across England, a lot of higher education institutions have started to ask questions about what it means to be “excellent” in teaching. In particular, with the rich and complex data that all educational institutions gather that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
10.30-11.00 Welcome and Coffee
11.00-11.30 Lightning presentations by participants, outlining insights about learning gains
11.30-13.00 Insights from the ABC-Learning Gains project
Dr Jekaterina Rogaten (OU): Reviewing affective, behavioural and cognitive learning gains in higher education of 54 learning gains studies
Prof Bart Rienties & Dr Jekaterina Rogaten (OU): Are assessment scores good proxies for estimating learning gains? A large-scale study amongst humanities and science students
Prof Rhona Sharpe (University of Surrey) & Dr Simon Cross (OU): Insights from 45 qualitative interviews with different learning gain paths of high and low achievers
Dr Ian Scott (Oxford Brookes) & Dr Simon Lygo-Baker (OU): Making sense of learning trajectories: a qualitative perspective
RtI/MTSS SPE 501-Spring 2021 Module 6 Adapted Assignment
Progress Monitoring Summary:
Step One:
Review all components of the IRIS Module on Progress Monitoring :IRIS
Step Two:
Write a three page typewritten double spaced summary of the Progress Monitoring process. Your summary does not need to include citations, just a clear summarization that shows your understanding of the process. Your summary should include the following points of the progress monitoring process.
A description of:
· The role of formative assessments
· The role of progress monitoring
· How progress monitoring measures are chosen
· The role of the graph of progress (hint: a goal line and a trend line (student’s progress) should be mentioned here)
· How data based instructional decisions are made
· How progress is communicated to pertinent staff and parents.
Step Three: RTI/MTSS Assignment - 501
The role of formative assessments: this type of assessment occurs during instruction and allows teachers to decide whether students are learning the material as it is presented to the class. Assessing while learning is happening permits teachers to adjust instruction to meet the learning needs of their students.
Formative assessments provide vital information regarding a student's progress toward particular learning objectives, her comprehension of the skills or material being taught, and any misconceptions she holds.
This assessment permits teachers to make informed decisions about when to revise or reteach material or skills, or to adjust instruction. It also identifies students who are consistently struggling.
Progress monitoring is a kind of formative assessment that is utilized within the elementary, middle, and high school environment. Progress monitoring permits teachers to:
⦁ "Frequently and constantly evaluate student learning.
⦁ Monitor the effectiveness of their instruction
⦁ "Make instructional changes to improve students' academic progress."
There are two kinds of progress monitoring: mastery measurement (MM) and general outcome measurement (GOM), which is often referred to as curriculum-based measurement (CBM). The GOM model is most commonly used for progress monitoring. Although scores from reading measures evaluate a student's progress, the results aren't used to assign grades. As students' reading skills improve, so will their scores on reading measures: scores are typically low at the beginning of the year and rise over time, which suggests students are learning.
There are many benefits to utilizing GOM. The role of progress monitoring also includes:
⦁ "Monitor student progress over time
⦁ Determine if the current instruction is assisting students to learn.
⦁ Determine if students are making adequate progress toward their learning goals
⦁ Identify students who aren't progressing adequately toward thei ...
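The graphing and decision-making steps above can be sketched numerically: the goal line runs from the baseline score to the end-of-year goal, a trend line is fit to the student's weekly probe scores, and comparing the two slopes drives the instructional decision. The scores, goal, and decision rule below are illustrative assumptions, not values from the IRIS module.

```python
# Minimal sketch of a CBM progress-monitoring decision rule.
# Weekly scores, the goal, and the comparison rule are illustrative
# assumptions, not prescribed values.

def trend_slope(scores):
    """Least-squares slope of scores over equally spaced weeks."""
    n = len(scores)
    weeks = range(n)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den

baseline, goal, total_weeks = 20, 60, 30       # words correct per minute
goal_slope = (goal - baseline) / total_weeks   # slope of the goal line

scores = [20, 22, 21, 25, 24, 27, 26, 29]      # eight weekly probes
actual_slope = trend_slope(scores)             # slope of the trend line

if actual_slope >= goal_slope:
    decision = "continue current instruction"
else:
    decision = "make an instructional change"

print(f"goal slope {goal_slope:.2f}, trend slope {actual_slope:.2f}: {decision}")
```

Here the trend line rises more slowly than the goal line, so the rule recommends an instructional change; communicating that comparison to staff and parents is the final step the summary asks for.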
The following slide deck highlights specific strategies teachers may utilize to enable students to develop assessment capabilities, a growth mindset, and the knowledge and skills to support others in their learning. This presentation was delivered at ASCD New Orleans 2016
Similar to NYSCOSS Conference Superintendents Training on Assessment 9 14 (20)
Dylan Wiliam seminar for district leaders: accelerate learning with formative... (NWEA)
Dylan Wiliam, internationally recognized researcher, formative assessment expert and founder of Keeping Learning on Track® believes districts that want to improve academic performance should make embedded formative assessment a priority.
Assessment Program Alignment: Making Essential Connections Between Assessment... (NWEA)
Presented by Mark Kessler at the Arizona Assessment Summit.
This session introduces a process to assist educators in building data literacy district-wide. Aligning the use of current school and district assessments and understanding the interrelationships of assessment, curriculum, and instruction are emphasized. Participants collaborate in establishing priorities for assessment practices and appropriate use of resulting data.
Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an... (NWEA)
Presented at Washington Educational Research Association (WERA) conference.
Presenters:
Highline Public Schools and Vancouver Public Schools
Sarah Johnson Sarah.Johnson@highlineschools.org
Paul Stern Paul.Stern@vansd.org
Presentation Overview:
- Background/The Value of Alignment Studies
- Highline’s Regression Study
- NWEA’s Linking Study
- Multi-District Regression Study
- Conclusions
- Applying the Results
Predicting Proficiency… How MAP Predicts State Test Performance (NWEA)
Predicting Proficiency… How MAP predicts State Test Performance
Paul Stern, District Enterprise Analyst, Vancouver Public Schools, Sarah Johnson, Accountability Project manager, Highline Public Schools, Burien, WA
Fusion 2012, the NWEA summer conference in Portland, Oregon
NWEA routinely produces “Linking Studies” that explore the alignment between the RIT Scale and state student proficiency exams. This presentation will share the results of an alignment study that applied a methodology developed by the Highline School District. The presentation will focus on how the results of the two methods differ and how Vancouver Public Schools will use this information to inform instruction and guide student interventions.
Learning outcome:
- Learn how to define proficiency using MAP cut scores.
- Understand the alignment of MAP to Washington’s State Assessments.
- Learn how alignment studies can be conducted and used to inform instruction
Audience:
- Experienced data user
- Advanced data user
- District leadership
- Curriculum and Instruction
Vancouver Public Schools serves approximately 22,000 students in Vancouver, WA, an urban/suburban district across the river from Portland. The presenter is the enterprise analyst within the Information Technology Services department focused on predictive analytics and performance measurement.
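A regression study of the kind described above can be sketched as follows: regress state-test scale scores on MAP RIT scores, then invert the fitted line at the state proficiency cut to obtain a MAP cut score. All numbers here are fabricated for illustration; this is not Highline's or NWEA's actual data or methodology.

```python
# Hedged sketch of a cut-score alignment study: regress state scale
# scores on MAP RIT scores, then invert the fit at the proficiency cut.
# All scores and the cut value are made-up illustrations.

def fit_line(xs, ys):
    """Ordinary least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Paired scores per student: (MAP RIT, state scale score) -- fabricated.
rit =   [195, 200, 205, 210, 215, 220, 225, 230]
state = [370, 380, 388, 400, 408, 421, 428, 441]

slope, intercept = fit_line(rit, state)

PROFICIENCY_CUT = 400                      # hypothetical state cut score
map_cut = (PROFICIENCY_CUT - intercept) / slope
print(f"predicted MAP cut score: {map_cut:.1f} RIT")
```

Students scoring at or above the derived RIT value would be predicted to reach proficiency on the state test, which is the sense in which MAP cut scores "define proficiency" in the learning outcomes above.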
Connecting the Dots: CCSS, DI, NWEA, Help!
Eileen Murphy Buckley, NCTE author and Consultant, Chicago Public Schools, IL
Fusion 2012, the NWEA summer conference in Portland, Oregon
Participants will learn how adopting the practices of close reading and evidence-based argumentation emphasized in the Common Core State Standards can work seamlessly within a differentiated literacy program called CERCA. Through centers that promote engagement, independence, and rigor, students develop critical thinking and academic language skills and practice the strategies and skills found throughout the DesCartes Continuum of Learning. As students move through centers designed to promote accountability for one's own learning and growth, teachers can strategically address individual and small-group support and enrichment needs on a daily basis. The session is especially relevant for literacy in grades 5-8.
Learning outcome:
- Participants will understand the role of close reading and argumentation in increasing rigor and growth.
- Participants will understand the benefits of using a common language and shared practices for literacy in a system or school.
- Participants will understand how centers-based instruction can help teachers differentiate instruction on a regular basis.
Audience:
-Experienced data user
I have recently left Chicago Public Schools, where I was the Director of Curriculum and Instruction for the AMPS Office (the office of Autonomous Schools). The AMPS team brought the pilot of NWEA to CPS, which has now adopted it system-wide. As part of the same team, we then led the Pershing Network within CPS. I helped schools evaluate, develop, and implement curriculum, instruction, and professional development plans to help teachers help students meet growth targets and begin implementing the CCSS through an evidence-based argumentation framework, which 50 schools, grades 3-12, adopted.
What’s New at NWEA: Power of Teaching
Fusion 2012, the NWEA summer conference in Portland, Oregon
What’s New at NWEA: Skills Pointer
Fusion 2012, the NWEA summer conference in Portland, Oregon
Finding Meaning in NWEA Data
Eric Lehew, Executive Director, Poway Unified School District, CA
Fusion 2012, the NWEA summer conference in Portland, Oregon
MAP data reports can be overwhelming. Making sense of how to use DesCartes can be daunting. This session will share strategies, teacher videos, and other resources to support teachers in the use of MAP data and DesCartes statements to inform instruction. Strategies for using MAP with students will also be shared.
Learning outcome:
- Instructional decision making with key MAP reports
- Managing and effectively using DesCartes as an instructional tool
- Engaging students as active participants in your MAP process
We have been a MAP-using district for over 10 years and have developed a variety of tools to support student, teacher, and parent participation with MAP data.
Audience:
- New data user
- District leadership
- Curriculum and Instruction
With a vision to expand virtual learning to all students, Beaufort County Schools has adopted an “Everywhere, All the Time” approach to education. The virtual summer school is not only intended to reduce summer learning loss, but also to engage parents as their child’s “learning coach.” The presentation will address the creative thoughts behind the virtual summer school, the implementation and logistics of managing such a system, and results.
An Alternative Method to Rate Teacher Performance (NWEA)
An Alternative Method to Rate Teacher Performance
Patricio A. Rojas, PH.D. Director of Research, Data & Assessment, Los Lunas, NM
Fusion 2012, the NWEA summer conference in Portland, Oregon
This session will provide participants the opportunity to experience an alternative method of rating teachers under New Mexico's new regulations. This is an updated version of the work presented last year at Fusion 2011. The alternative method is needed because we do not have growth points for the 2010-2011 year in New Mexico.
Learning outcome:
- Learn easy graphs to analyze growth, and how to rate teacher performance without using growth points.
Los Lunas is located 35 miles south of Albuquerque; the district has 9,000 students and 17 schools (3 high schools, 2 middle schools, and 12 elementary schools). The district is one of the few nationally accredited districts in the nation. We have been using MAP as a short-cycle assessment for the last six years. MAP scores are an important piece of data used to rate both schools and teachers.
Audience:
- Experienced data user
- District leadership
- Curriculum and Instruction
Data Driven Learning and the iPad
Richard Harrold, Principal, ACS Cobham International School, UK
Fusion 2012, the NWEA summer conference in Portland, Oregon
ACS Cobham International School was one of the first schools to accompany its iPad implementation with a formal study of the effect of iPads in the affective and academic domains. This session will show how MAP data contributed to the study's conclusions and will provide participants with a tool to gauge the effectiveness of mobile technology in general and the iPad in particular. Using engagement theory as a guide, ACS Cobham has completed a mixed-methods study that will be of interest to schools exploring the potential of mobile devices to enhance both learning and affective domain behaviors. Educators keen to see how data-driven goal setting can come alive for the iGeneration should attend.
Learning outcome
- How can the effect of mobile technology be objectively measured?
- How can I make goal setting relevant to iGeneration students?
Audience:
- New data user
- Experienced data user
- Advanced data user
- District leadership
- Curriculum and Instruction
ACS Schools comprises three international schools on the outskirts of London, UK, and one school in Doha, Qatar. The combined total of students is around 3,000. The three UK schools have been administering MAP since 2009. We use DesCartes and instructional resources across the district to guide instructional planning. Last year we began using NWEA Science tests for the first time. Our team includes our Assistant Head of School, the assistant principals from the Lower and Middle Schools, the assistant academic dean, a member of our IT support staff, and three classroom teachers (one from each of the three divisions of the school using MAP).
21st Century Teaching and Learning
Sue Beers, Director, Mid-Iowa School Improvement Consortium, IA
Fusion 2012, the NWEA summer conference in Portland, Oregon
What are the skills students will need to successfully navigate the 21st century? What are the learning preferences of today’s learners? Participants will explore a model for 21st century instructional planning that integrates learner attitudes, motivation, and engagement; effective use of technology; subject area content; the three Rs (reading, writing and math); and the four Cs (creativity, critical thinking, communication, and collaboration).
Learning outcome:
- Identify the learning preferences and styles of today's learners.
- Examine a model for incorporating 21st century skills with literacy skills and content standards.
Audience:
- District leadership
- Curriculum and Instruction
MISIC is a consortium of approximately 160 school districts in Iowa, focused on developing tools and resources to help improve student achievement.
Thomas R. Guskey keynote address at Fusion 2012, the NWEA summer conference in Portland, Oregon.
"Grading and Reporting Student Learning"
You Want Us to Do WHAT????
Dr. Becky Blink, Data-Driven Instructional Solutions, LLC. WI
Fusion 2012, the NWEA summer conference in Portland, Oregon
Do you feel like your head is spinning with all the initiatives that have fallen into the field of education? This presentation will help you FUSE it all together: MAP, Common Core, RTI, and Odyssey (content partner to NWEA). Differentiated lesson plans will be shared, and a newly designed template will be unveiled to help teachers create a plan for RTI intervention. These examples can provide you and your teachers with immediate practical applications to classroom instruction.
Learning Outcome:
- Participants will leave with an understanding of how to use MAP data to differentiate their universal classroom instruction.
- Participants will leave with an understanding of how to create their own lesson plan based on MAP data.
- Participants will leave with an overall concept of how MAP, RTI, and Common Core standards all fit together under one umbrella.
Audience:
- New data user
- Experienced data user
- Advanced data user
- District leadership
- Curriculum and Instruction
MAK Mitchell keynote address at Fusion 2012, the NWEA summer conference in Portland, Oregon.
"Finding Ground Truth in Data:
Consensus Rules!"
MAK leads a consensus governance model for 900 principals of public schools and charters co-located on 380 campuses in New York City. In this keynote, she will tell the story of how her powerful learnings from campus consensus work became the source of a unique consensus turnaround model.
After detailing best practice consensus strategies from her governance work with campus principals, she poses the question: Can consensus become a lever for producing achievement results that last? MAK will be offering a workshop session later in the agenda that unpacks the turnaround consensus model in greater detail for those who are interested in implementation.
MAK Mitchell is the Executive Director of School Governance for the New York City Public Schools and President of ARMAK Associates. Previously, MAK served in Washington State as a professor and consultant of organizational change, superintendent and founder of numerous small high schools in Alaska. MAK earned both her master’s and doctoral degrees from the Harvard Graduate School of Education, and is a founding member of the Society for Organizational Learning.
Using DesCartes Instructional Ladders to Plan for Differentiated Instruction (NWEA)
Using DesCartes Instructional Ladders to Plan for Differentiated Instruction
Sara Reiter, Project Manager, Excellence in Instruction, Kansas City Kansas Public Schools, KS; Jan Brunell, Education Research Development Council, MN
Fusion 2012, the NWEA summer conference in Portland, Oregon
In this session you will see a transformation of DesCartes to teacher-friendly instructional ladders that have promoted differentiated instruction and quality lesson planning in our district. You will also learn how we work to meet the individual needs of all learners through the use of DesCartes instructional ladders in combination with other data including: growth data, national college-readiness data, state assessment data, and formative assessment data.
Learning outcome:
- Use DesCartes Instructional Ladders with other data to promote differentiated instruction and quality lesson planning.
Audience:
- New data user
- Experienced data user
- Advanced data user
- Curriculum and Instruction
Kansas City Kansas Public Schools is an urban district serving a diverse population of twenty thousand students. We have used MAP data to differentiate instruction and encourage student growth for the past six years.
Knowledge and skills frameworks, generally called competency frameworks, for ELT teachers, trainers and managers have existed for a few years now. However, until I created one for my MA dissertation, there wasn’t one drawing together what we need to know and do to be able to effectively produce language learning materials.
This webinar will introduce you to my framework, highlighting the key competencies I identified from my research. It will also show how anybody involved in language teaching (any language, not just English!), teacher training, managing schools or developing language learning materials can benefit from using the framework.
2. Trying to gauge my audience and adjust my speed . . .
• How many of you think your literacy with assessments is "Good" or better?
• How many of you have a fine-tuned assessment program?
• How many of you think your practical knowledge about using data for systemic improvement is "Good" or better?
3. The Bottom Line First: Here's what you must have
• An adequate depth of knowledge to ask really good questions, and the tenacity to do so
• A person willing to "Speak the Truth, With Love, To Power"
• A big lever – "What gets measured (and attended to), gets done"
The measurement and reinforcement system is your responsibility
4. My Purpose
• Increase your understanding of various urgent assessment-related topics
– Ask better questions
– Useful for making all types of decisions with data
5. Three main topics
• Assessment basics +
• Improving your assessment program
• Data culture
6. Go forth thoughtfully, with care
• What we've known to be true is now being shown to be true
– Using data thoughtfully improves student achievement and growth rates
– 12% mathematics, 13% reading
• There are dangers present, however
– Unintended consequences
Slotnik, W. J., & Smith, M. D. (2013, February). It's More Than Money. Retrieved from http://www.ctacusa.com/PDFs/MoreThanMoney-report.pdf
7. Remember the old adage?
“What gets measured (and attended to),
gets done”
8. An infamous example: NCLB
– Cast light on inequities
– Improved performance of "bubble kids"
– Narrowed the taught curriculum
The same dynamic happens inside your schools
9. A patient's health doesn't change because we know their blood pressure
It's our response that makes all the difference
It's what we do that counts
10. Be considerate of the continuum of stakes involved
Support → Compensate → Terminate
Increasing levels of required rigor; increasing risk
12. Three primary conditions
1. Alignment between the content assessed and the content to be taught
2. Selection of an appropriate assessment
• Used for the purpose for which it was designed (proficiency vs. growth)
• Can accurately measure the knowledge of all students
• Adequate sensitivity to growth
3. Adjust for context/control for factors outside a teacher's direct control (value-added)
13. Two approaches we like
1. Assessment results used wisely as part of a dialogue to help teachers set and meet challenging goals
2. Use of tests as a "yellow light" to identify teachers who may be in need of additional support or are ready for more
14. What question is being answered in support of using data in evaluating teachers?
Is the progress produced by this teacher dramatically different from that of teaching peers who deliver instruction to comparable students in comparable situations?
16. There are four key steps required to answer this question
Top-Down Model
The Test
The Growth Metric
The Evaluation
The Rating
17. Let’s begin at the beginning
The Test
The Growth Metric
The Evaluation
The Rating
18. What is measured should be aligned to what is to be taught
Would you use a general reading assessment in the evaluation of a…
• 3rd Grade ELA Teacher?
• 3rd Grade Social Studies Teacher?
• Elem. Art Teacher?
3rd Grade ELA Standards (examples):
1. Answer questions to demonstrate understanding of text…
2. Determine the main idea of a text…
3. Determine the meaning of general academic and domain-specific words…
~30% of teachers teach in tested subjects and grades
The Other 69 Percent: Fairly Rewarding the Performance of Teachers of Nontested Subjects and Grades, http://www.cecr.ed.gov/guides/other69Percent.pdf
19. What is measured should be aligned to what is to be taught
• Assessments should align with the teacher's instructional responsibility
– Specific advanced content
• HS teachers teaching discipline-specific content – especially 11th and 12th grade
• MS teachers teaching HS content to advanced students
– Non-tested subjects
• School-wide results are more likely "professional responsibility" rather than reflecting competence
– HS teachers providing remedial services
20. The purpose and design of the instrument is significant
• Many assessments are not designed to measure growth
• Others do not measure growth equally well for all students
21. Let's ensure we have similar meaning
[Diagram: a vertical scale from Beginning Literacy to Adult Reading, with a 5th grade student's status marked at Time 1 and Time 2]
Two assumptions:
1. Measurement accuracy, and
2. Vertical interval scale
23. What does it take to accurately measure achievement?
Questions surrounding the student's achievement level
The more questions the merrier
24. Teachers encounter a distribution of student performance
[Diagram: individual students' scores scattered along a scale from Beginning Literacy to Adult Reading, clustered around 5th grade-level performance but spread well above and below it]
31. Let's measure height again
If I was measured as: 5' 9"
And a year later I was: 1.82m
Did I grow? Yes. ~ 2.7"
How do you know?
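The point of the height example is that growth can only be computed once both measurements sit on one common interval scale. A minimal sketch of that arithmetic (the conversion constants are standard; the figure works out to roughly 2.7 inches):

```python
# Growth can only be computed once both measurements are on one scale.
# Convert feet/inches and meters to centimeters, then compare.

CM_PER_INCH = 2.54

def to_cm_from_feet_inches(feet, inches):
    """Convert an imperial height to centimeters."""
    return (feet * 12 + inches) * CM_PER_INCH

def growth_in_inches(first_cm, second_cm):
    """Growth on the common scale, reported back in inches."""
    return (second_cm - first_cm) / CM_PER_INCH

time1 = to_cm_from_feet_inches(5, 9)  # 5' 9" -> 175.26 cm
time2 = 1.82 * 100                    # 1.82 m -> 182 cm
print(round(growth_in_inches(time1, time2), 1))  # -> 2.7
```

The same logic is why vertically scaled assessments matter: without a common scale across test events, "did the student grow?" has no defensible answer.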
32. Traditional assessment uses items reflecting the grade-level standards
[Diagram: a traditional assessment item bank on a scale from Beginning Literacy to Adult Reading, with items drawn from the 4th, 5th, and 6th grade-level standards]
33. Traditional assessment uses items reflecting the grade-level standards
[Diagram: adjacent grade-level item sets overlap on the scale]
Overlap allows linking and scale construction
34. Error can change your life!
• Think of a high-stakes test – a state summative assessment
– Designed mainly to identify whether a student is proficient or not
• Does it do that well?
– 93% correct on proficiency determination
• Does it perform as well beyond its design?
– 75% correct on performance-level determination
*Testing: Not an Exact Science, Education Policy Brief, Delaware Education Research & Development Center, May 2004,
http://dspace.udel.edu:8080/dspace/handle/19716/244
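To make those accuracy figures concrete, here is a small sketch; the 1,000-student cohort is hypothetical, while the 93% and 75% rates come from the Delaware brief cited above:

```python
# Sketch: what classification accuracy implies for a hypothetical
# cohort of 1,000 tested students.

def expected_misclassified(n_students, accuracy):
    """Expected number of students assigned the wrong category."""
    return round(n_students * (1 - accuracy))

cohort = 1000
print(expected_misclassified(cohort, 0.93))  # -> 70 wrong proficient/not-proficient calls
print(expected_misclassified(cohort, 0.75))  # -> 250 wrong performance-level calls
```

Even the test's designed use mislabels dozens of students per thousand; pushing it beyond its design roughly triples that.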
35. Using tests in high-stakes ways creates a new dynamic
• Tests specifically designed to inform classroom instruction and school improvement in formative ways
No incentive in the system for inaccurate data
36. New phenomenon when used as part of a compensation program
[Chart: mean value-added growth by school, comparing students taking 10+ minutes longer in spring than fall vs. all other students]
37. Another consequence
When teachers are evaluated on growth using a once-per-year assessment, one teacher who cheats disadvantages the next teacher
38. Lessons?
• What were some things you learned?
• What practices do you want to reinforce?
• What do you need to do differently?
• Think – Pair
– 2 min to make some notes
– 3 min to share with a neighbor
39. Testing is complete . . .
What is useful to answer our question?
The Test
The Growth Metric
The Evaluation
The Rating
40. The problem with spring–spring testing
[Timeline: Teacher 1's instruction, summer, then Teacher 2's instruction, spanning 4/14 through 4/15 – a spring-to-spring window covers two teachers and a summer]
41. A better approach
• When possible use a spring – fall – spring approach
• Measure summer loss and incentivize schools and teachers to minimize it
• Measure teacher performance fall to spring, giving as much instructional time as possible between assessments
• Monitor testing conditions to minimize gaming of fall or spring results
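The spring – fall – spring design separates summer loss from growth that happened under the current teacher. A minimal sketch with hypothetical scores:

```python
# Sketch: splitting spring-to-spring change into summer loss and
# fall-to-spring growth. Scores below are hypothetical.

def summer_change(prior_spring, fall):
    """Negative values indicate summer loss (not the current teacher's doing)."""
    return fall - prior_spring

def teacher_attributed_growth(fall, spring):
    """Growth over the instructional year, fall to spring."""
    return spring - fall

prior_spring, fall, spring = 205, 201, 212  # hypothetical student scores

print(summer_change(prior_spring, fall))        # -> -4 (summer loss)
print(teacher_attributed_growth(fall, spring))  # -> 11
print(spring - prior_spring)                    # -> 7 (all a spring-spring design can see)
```

The last line is the point of the previous slide: a spring–spring metric would credit the current teacher with 7 points of growth, silently absorbing the 4 points lost over a summer the teacher never saw.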
42. The metric matters – Let's go underneath "Proficiency"
[Chart: difficulty of the New York proficient cut score, in national percentiles, by grade (2–8), for reading and math, against a college-readiness benchmark]
New York Linking Study: A Study of the Alignment of the NWEA RIT Scale with the New York State (NYS) Testing Program, November 2013
43. The metric matters – Let's go underneath "Proficiency"
Dahlin, M., & Durant, S., The State of Proficiency, Kingsbury Center at NWEA, July 2011
44. What actually happened?
[Chart: estimated proficiency rates for six NY districts, 4th grade mathematics, with proficiency cut scores changed – 2012: 87; 2013: 55; reported change: -31]
45. What actually happened?
[Chart 1: estimated proficiency rates for six NY districts, 4th grade mathematics, with proficiency cut scores changed – 2012: 87; 2013: 55; reported change: -31]
[Chart 2: the same districts with the 2013 proficiency cut scores applied to both years – 2012: 46; 2013: 55; actual change: +10]
46. What gets measured and attended to really does matter
[Chart: number of students by fall RIT in mathematics, coded No Change / Down / Up, with proficiency and college-readiness marks]
One district's change in 5th grade mathematics performance relative to the KY proficiency cut scores
47. Changing from proficiency to growth means all kids matter
[Chart: number of students by fall mathematics score, coded Below projected growth / Met or above projected growth]
Number of 5th grade students meeting projected mathematics growth in the same district
48. Lessons?
• What were some things you learned?
• What practices do you want to reinforce?
• What do you need to do differently?
• Think – Pair – Share
– 2 min to make some notes
– 3 min to share with a neighbor
49. How can we make it fair?
The Test
The Growth Metric
The Evaluation
The Rating
50. Without context, what is "Good"?
[Diagram: a scale from Beginning Reading to Adult Literacy with several reference frames laid alongside it – national percentile norms (from a scale norms study), college readiness (ACT benchmarks), state test performance levels ("Meets" proficiency), and Common Core proficient performance levels]
51. Normative data for growth is a bit different
Basic factors (outside of a teacher's direct control):
• Grade: 5th
• Subject: Reading
• Fall score / starting achievement
• Instructional weeks
→ Typical growth: 7 points
Open questions: FRL vs. non-FRL? IEP vs. non-IEP? ESL vs. non-ESL?
52. A Visual Representation of Value Added
Fall 5th grade test: Student A scores 200
Spring 5th grade test: Student A scores 209
Average spring score for similar students: 207
Value added: +2
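The computation on this slide is just the difference between a student's actual spring score and the average spring score of comparable students; defining "comparable" is the hard part the value-added model handles. A sketch (the roster beyond Student A is hypothetical):

```python
# Sketch of the value-added idea: compare each student's actual spring
# score to the average spring score of comparable students.

def value_added(actual_spring, expected_spring):
    """Positive = more growth than similar students; negative = less."""
    return actual_spring - expected_spring

# Student A from the slide: spring score 209; similar students
# (same fall score and context) average 207 in spring.
print(value_added(209, 207))  # -> 2

# A teacher-level figure is then an average over the class roster;
# these (actual, expected) pairs are hypothetical.
roster = [(209, 207), (198, 201), (215, 210)]
print(sum(value_added(a, e) for a, e in roster) / len(roster))
```

Note the fall score never appears in the subtraction itself; it enters through the expected spring score, which is conditioned on where each student started.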
53. Consider . . .
• What if I skip this step?
– The comparison is likely against normative data, so the comparison is to "typical kids in typical settings"
• How fair is it to disregard context?
– Good teacher – bad school
– Good teacher – challenging kids
54. Value-added is science
• Value-added models can control for a variety of classroom, school-level, and other conditions
– Proven statistical methods
– All attempt to minimize error
– Variables outside the controls are assumed to be random
55. A variety of errors means more stability only at the extremes
• Control for measurement error
– All models attempt to address this issue
• Population size
• Multiple data points
– Error is compounded when combining two test events
– Many teachers' value-added scores will fall within the range of statistical error
56. Range of teacher value-added estimates
[Chart: Mathematics Growth Index Distribution by Teacher – Validity Filtered; average growth index score and range, teachers ordered by quintile (Q1–Q5)]
Each line in this display represents a single teacher. The graphic shows the average growth index score for each teacher (green line), plus or minus the standard error of the growth index estimate (blue line). We removed students who had tests of questionable validity and teachers with fewer than 20 students.
58. The assumption of randomness can have risk implications
• Value-added models assume that variation is caused by randomness if not controlled for explicitly
– Young teachers are assigned disproportionate numbers of students with poor discipline records
– Parent requests for the "best" teachers are honored
• Sound educational reasons for placement are likely to be defensible
59. Instability at the tails of the distribution
"The findings indicate that these modeling choices can significantly influence outcomes for individual teachers, particularly those in the tails of the performance distribution who are most likely to be targeted by high-stakes policies."
Ballou, D., Mokher, C., & Cavalluzzo, L. (2012). Using Value-Added Assessment for Personnel Decisions: How Omitted Variables and Model Specification Influence Teachers' Outcomes.
[Examples: LA Times Teacher #1, LA Times Teacher #2]
60. How tests are used to evaluate teachers
The Test
The Growth Metric
The Evaluation
The Rating
61. Translation into ratings can be difficult to inform with data
• How would you translate a rank order to a rating?
• Data can be provided
• A value judgment is ultimately the basis for setting cut scores for points or ratings
62. Decisions are value based, not empirical
• What is far below a district's expectation is subjective
• What about
– The obligation to help teachers improve?
– The quality of replacement teachers?
63. Even multiple measures need to be used well
• The system for combining elements and producing a rating is also a value-based decision
– Multiple measures and principal judgment must be included
– Evaluate the extremes to make sure it makes sense
64. Leadership Courage Is A Key
Ratings can be driven by the assessment
[Chart: observation vs. assessment ratings (0–5) for Teacher 1, Teacher 2, and Teacher 3]
Real or noise?
65. Big Message
If evaluators do not differentiate their ratings, then all differentiation comes from the test
66. Lessons?
• What were some things you learned?
• What practices do you want to reinforce?
• What do you need to do differently?
• Think – Pair – Share
– 2 min to make some notes
– 3 min to share with a neighbor
– 2 min for two report outs on anything so far
68. Let's Define Types of Assessments
• Read the sheet and highlight anything interesting to you
69. The pursuit of compliance is exhausting because it is always a moving target. Governors move on, the party in power gets replaced, a new president is elected, and all want to put their own stamp on education.
It is saner and less exhausting to define your own course and align compliance requirements to that.
76. The metrics and incentives used encourage a focus on all learners (6)
77. The assessment program contributes to a climate of transparency and objectivity with a long-term focus (7)
78. 1. Typical assessment
purposes
• Identify student learning needs
• Identify groupings of students for instruction
• Guide instruction
• Course placement
• Determine eligibility for programs
• Award credits and/or assign grades
• Evaluate proficiency
• Monitor student progress
• Predict proficiency
• Project achievement of a goal
• Formative and summative evaluation of programs
• Formative evaluation to support school and teacher improvement
• Report student achievement, growth, and progress to the community and stakeholders
• Summative evaluation of schools and teachers
79. 1. Assessment Purpose Survey
To increase value, identify gaps between:
1. How critical is this data to your work?
2. How do you actually use this data?
Take 10 min to fill this out and 5 min to pair and discuss areas of biggest gap
80. 3. Eliminate waste
Compare assessments and their purposes to find unnecessary overlaps
Take 10 min to fill this out and 5 min to pair and discuss areas of redundancy
81. Lessons?
• What were some things you learned?
• What practices do you want to reinforce?
• What do you need to do differently?
• Think – Pair – Share
– 2 min to make some notes
– 3 min to share with a neighbor
– 2 min for two report outs on this section
83. Education Organizations Mature
• Poor to Fair: Achieving the basics of literacy and numeracy
• Fair to Good: Getting the foundations in place
• Good to Great: Shaping the professional
• Great to Excellent: Improving through peers and innovation
Data use does too
Barber, M., Chijioke, C., & Mourshed, M. (2011). How the world's most improved school systems keep getting better. McKinsey & Company.
84. Data Use Continuum
• Poor to Fair: One on One
• Fair to Good: Within Teams
• Good to Great: Within the Walls
• Great to Excellent: Across the Walls
Requires a shift in the culture
86. Reflection Time
• Where are your pockets of most maturity?
• Least maturity?
• What is causing the differences?
Think - 2 min
87. Research on data use in school improvement
• Education problems are "wicked"
– Problem boundaries are ill-defined
– No definitive solutions
– Highly resistant to change
– Problem and solutions depend on perspective
– Changes are consequential
Data can only take you so far
88. Research on data use in school improvement
• Use data as a platform for deeper conversations
• Define your problem well
– Problem title and description
– Magnitude
– Location
– Duration
89. Research on data use in school improvement
• Part of a continuous improvement process
– Data conversations
• Collaborative
• Embedded in culture
• Structured process
• Love, Nancy – Using Data to Improve Learning for All
• Lipton & Wellman – Got Data? Now What?
• NWEA Data Coaching
Teacher evaluations and the use of data in them can take many forms. You can use them to support teachers and their improvement. You can use the evaluations to compensate teachers or groups of teachers differently, or you can use them in their highest-stakes way: to terminate teachers.
The higher the stakes put on the evaluation, the more risk there is to you and your organization from a political, legal, and equity perspective. Most people naturally respond by increasing the level of rigor put into designing the process as a way to ameliorate the risk. One fact remains: the risk can't be eliminated.
Our goal – make sure you are prepared. Understand the risk and the proper ways to implement, including legal issues. Clarify some of the implications – it is very complex – and prepare you with a prudent course.
This is the value added metric
Not easy to make nuanced decisions. Can learn about the ends.
Contrast with what value added communicates
Plot normal growth for Marcus vs anticipated growth – value added. If you ask whether the teachers provided value added, the answer is Yes.
Other line is what is needed for college readiness
Blue line is what is used to evaluate the teacher.
Is he on the line the parents want him to be on? Probably not.
Don't focus on one at the expense of the other
NCLB – AYP vs. what the parent really wants for goal setting
We can become so focused on measuring teachers that we lose sight of what parents value
We are better off moving toward the kids' aspirations
As a parent I didn't care if the school made AYP. I cared if my kids got the courses that helped them go where they wanted to go.
Steps are quite important. People tend to skip some of these.
Kids take a test – important that the test is aligned to instruction being given
Metric – look at growth vs growth norm and calculate a growth index. Two benefits – Very transparent/Simple.
People tend to use our growth norms – if you hit 60% for a grade level within a school you are doing well.
Norms – growth of a kid or group of kids compared to a nationally representative sample of students
Why isn’t this value added?
Not all teachers can be compared to a nationally representative sample because they don’t teach kids that are just like the national sample
The third step controls for variables unique to the teacher’s classroom or environment
Fourth step – rating – how much below average before the district takes action or how much above before someone gets performance pay. Particular challenge in NY state right now. Law requires it.
Common core – very ambitious things they want to measure – tackle things on an AP test. Write and show their work.
A CC assessment to evaluate teachers can be a problem.
Raise your hand if you know what the capital of Chile is. Santiago. Repeat after me. We will review in a couple of minutes. Facts can be relatively easily acquired and are instructionally sensitive. If you expose kids to facts in meaningful and engaging ways, learning is sensitive to instruction.
State assessment designed to measure proficiency – many items in the middle not at the ends
Must use multiple points of data over time to measure this.
We also believe that a principal should be more in control of the evaluation than the test – Principal and Teacher leaders are what changes schools
5th grade NY reading cut scores shown
NCLB required everyone to get above proficient – message focus on kids at or near proficient
School systems responded
MS standards are harder than the elem standards – MS problem
No effort to calibrate them – no effort to project elem to ms standards
Start easy and ramp up.
Proficient in elem and not in MS with normal growth.
When you control for the difficulty in the standards Elem and MS performance are the same
Not only are standards different across grades, they are different across states.
It’s data like this that helps to inspire the Common Core and consistent standards so we compare apples to apples
Dramatic differences between standards based vs growth
KY 5th grade mathematics
Sample of students from a large school system
X-axis Fall score, Y number of kids
Blue are the kids who did not change status between the fall and the spring on the state test
Red are the kids who declined in performance over the spring – descenders
Green are kids who moved above it in performance over the spring – ascenders – the bubble kids
About 10% based on the total number of kids
Accountability plans are made typically based on these red and green kids
Same district as before
Yellow – did not meet target growth – spread over the entire range of kids
Green – did meet growth targets
60% vs 40% is doing well – This is a high performing district with high growth
Must attend to all kids – this is a good thing – ones in the middle and at both extremes
Old one was discriminatory – focus on some in lieu of others
Teachers who teach really hard at the standard for years – Teachers need to be able to reach them all
This does a lot to move the accountability system to parents and our desires.
Be ready to coach the media – Need to be assessment literate
Importance of longitudinal data systems and consistent measurement – Lots of impacts – Program evaluation for example
There is a huge difference between achievement (and therefore growth) and proficiency
Close by noting that NWEA recognized the need for this level of precision when trying to understand student performance (and by extension, teacher performance). This is why, in NY (where we first began having this conversation with partners), we sought to partner with VARC, because of their background and experience providing these services (and because this is something that we did not want to do, even if we had the background/experience).
Talk about the number of districts and students in 11-12 and 12-13, to provide context for the ability for this to be done on a broad scale.
There are wonderful teachers who teach in very challenging, dysfunctional settings. The setting can impact the growth. HLM embeds the student in a classroom, the classroom in the school, and controls for the school parameters. Is it perfect? No. Is it better? Yes.
Opposite is true and learning can be magnified as well.
What if kids are a challenge, ESL or attendance for instance. It can deflate scores especially with a low number of kids in the sample being analyzed. Also need to make sure you have a large enough ‘n’ to make this possible especially true in small districts.
Our position is that a test can inform the decision, but the principal/administrator should collect the bulk of the data that is used in the performance evaluation process.
Experts recommend multiple years of data to do the evaluation. Invalid to just use two points and will testify to it.
Principals never fire anyone – NY rubber room – myth
If they do, it’s not fast enough. – Need to speed up the process
This won’t make the process faster – Principals doing intense evaluations will
Measurement error is compounded in test 1 and test 2
Green line is their VA estimate and bar is the error of measure
Both on top and bottom people can be in other quartiles
People in the middle can cross quintiles – just based on SEM
Cross country – winners spread out. End of the race spread. Middle you get a pack. Middle moving up makes a big difference in the overall race.
Instability and narrowness of ranges mean that for teachers in the middle of the distribution, slight changes in performance can produce a large change in performance ranking
Non-random assignments
Models control for various things – FRL, ethnicity, overall school effectiveness. Beyond this point, assignment is assumed to be random.
1st-year teachers get more discipline problems than teachers who have been there 30 years, who pick the kids they get. If the model doesn't control for disciplinary record – and none have that data – scores are inflated. That makes the model invalid.
Principals do need to do non-random assignment – sound educational reasons for the placement – matching adults to kids
Use NY point system as the example