Andy Hegedus, Ed. D.
Kingsbury Center at NWEA
September 2013
Measuring student
growth accurately – It
makes a difference in
your world!
• Goal is to improve student achievement by improving workforce performance over time
– As in any profession, there is variability in performance
• Belief system driving policy
– A rigorous performance evaluation process, and the rewards, support, or removal of teachers that comes with it, is a major lever
Overview/Setting the stage
[Chart: Evaluator ratings of teachers, distributed across Ineffective, Developing, Effective, and Highly Effective.]
What is happening just can’t be
right!
5800 teachers evaluated between January and May 2012, The Atlanta Journal-Constitution January 7, 2013
“Statistically, this flies in the face of our academic achievement levels. These
numbers just doesn’t jibe with reality,” Millar said. “If the Georgia evaluation
system is going to be based on these type of statistics, I wouldn’t see us going
forward with it because, just statistically, it can’t be valid.”
Focus should likely be
elsewhere (on the 99%)
Executive Brief: Tracking Trends in Employee Turnover, Retrieved March
11, 2013, http://www.shrm.org/research/benchmarks/documents/trends%20in%20turnover_final.pdf
Remaining Workforce:
• Effectiveness of surrounding system
• Powerful Professional Development
• Performance management system
explicitly designed to improve
performance
Voluntary Turnover:
• Working conditions
• Induction and support
Involuntary Turnover:
• Financial stability
• Keep the best
[Pie chart: 2011 turnover percentages – Remaining workforce 85%, Voluntary turnover 9%, Involuntary turnover 6%.]
• Increase your understanding of various urgent assessment-related topics
– Ask better questions
– Useful for making all types of decisions with data
• Follow along and ask questions at any time
– Slideshare.net
• Will pause during transitions for you to discuss
“Ah-Ha’s” with a neighbor
My Purpose
1. Selection of an appropriate test:
• Used for the purpose for which it was designed
(proficiency vs. growth)
• Can accurately measure the test performance of all
students
2. Alignment between the content assessed and
the content to be taught
3. Adjust for context/control for factors outside a
teacher’s direct control (value-added)
Three primary conditions for using
tests for teacher evaluation
1. Evaluation process that focuses on helping
teachers improve
2. The principal or designated evaluator should
control the evaluation
3. Tests should inform the process, not dictate
or decide it
4. Multiple measures should be used over time
What NWEA supports
1. Use of tests as part of a
dialogue to help teachers
set improvement goals
2. Use of tests as a “yellow
light” to identify teachers
who may be in need of
additional support or
assistance
Two approaches we like
• What we’ve known to be true is now being
shown to be true
– Using data thoughtfully improves student
achievement
– 12% mathematics, 13% reading
• There are dangers present however
– Unintended Consequences
Go forth thoughtfully
with care
Slotnik, W. J. , Smith, M. D., It’s more than money, February 2013, retrieved from
http://www.ctacusa.com/PDFs/MoreThanMoney-report.pdf
“What gets measured (and attended to),
gets done”
Remember the old adage?
• NCLB
–Cast light on inequities
–Improved performance of “Bubble Kids”
–Narrowed taught curriculum
An infamous example
It’s what we do that counts
A patient’s health
doesn’t change
because we know
their blood pressure
It’s our response that
makes all the
difference
1. Shifting towards tighter state level
control – a shift of decision-making
away from local control
2. Our nation moved from a model of
education reform that focused on fixing
schools to a model that is focused on
fixing the teaching profession
Policy shifts make today’s
conversation inevitable
Be considerate of the continuum of
stakes involved
Support
Compensate
Terminate
Increasing levels of required rigor
Increasingrisk
The use of value-added data for high stakes personnel decisions does not yet have a strong, coherent body of case law.
Expect litigation if value-added results are the
lynchpin evidence for a teacher-dismissal case
until a body of case law is established.
• Due Process
• Disparate impact doctrine
Potential Litigation Issues
Baker, B., Oluwole, J., & Green, P. (2013). The legal consequences of mandating high stakes decisions based on low quality information: Teacher evaluation in the Race to the Top era. Education Policy Analysis Archives, 21(5).
Suggested reading
Is the progress produced by this teacher dramatically different from that of teaching peers who deliver instruction to comparable students in comparable situations?
What question is being answered in support of
using data in evaluating teachers?
[Chart: Marcus’ growth compared with normal growth and the growth needed to reach the college readiness standard.]
The Test
The Growth Metric
The Evaluation
The Rating
There are four key steps required to
answer this question
Top-Down Model
Assessment 1
Goal Setting
Assessment(s)
Results and Analysis
Evaluation (Rating)
How does the other popular process
work?
Bottom-Up Model
(Student Learning Objectives)
Understanding all four of the top-down elements is needed here
The Test
The Growth Metric
The Evaluation
The Rating
Let’s begin at the beginning
3rd Grade
ELA
Standards
3rd Grade
ELA
Teacher?
3rd Grade
Social
Studies
Teacher?
Elem. Art
Teacher?
What is measured should be
aligned to what is to be taught
1. Answer questions to demonstrate
understanding of text….
2. Determine the main idea of a
text….
3. Determine the meaning of general
academic and domain specific
words…
Would you use MAP in the
evaluation of a….
~30% of teachers teach in tested subjects and grades
The Other 69 Percent: Fairly Rewarding the Performance of Teachers of Nontested Subjects and
Grades, http://www.cecr.ed.gov/guides/other69Percent.pdf
• Assessments should align with the
teacher’s instructional responsibility
– Specific advanced content
• HS teachers teaching discipline specific content
– Especially 11th and 12th grade
• MS teachers teaching HS content to advanced students
– Non-tested subjects
• School-wide results more likely reflect “professional responsibility” than individual competence
– HS teachers providing remedial services
What is measured should be
aligned to what is to be taught
• Many assessments are
not designed to
measure growth
• Others do not measure
growth equally well for
all students
The purpose and design of the instrument are significant
Both status and growth are
important but growth leads
[Diagram: A vertical scale from beginning literacy to adult reading; a 5th grade student’s status is marked at Time 1 and Time 2, and growth is the change between the two marks.]
Two assumptions:
1. Measurement
accuracy, and
2. Vertical scale
Accurately measuring growth
depends on accurately measuring
achievement
How about measuring
height?
What if the pencil isn’t
very level?
What if we marked with
sidewalk chalk?
Measurement Accuracy
A test for you
[Diagram: The same beginning-literacy-to-adult-reading scale, with the 5th grade student’s measured score marked at Time 1 and Time 2.]
Pop Quiz:
What’s bigger?
1. Time 1 Error or Time 2 Error alone
2. Time 2 minus Time 1 Error (Growth)
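The growth error is the bigger one: when the two test events have independent errors, their variances add, so the uncertainty on Time 2 minus Time 1 exceeds either error alone. A minimal sketch of that arithmetic (the SEM values below are illustrative, not MAP’s):

```python
import math

def growth_sem(sem_time1, sem_time2):
    """Standard error of a growth score (Time 2 minus Time 1),
    assuming the two measurement errors are independent."""
    return math.sqrt(sem_time1 ** 2 + sem_time2 ** 2)

# Two test events that each measure achievement to about +/- 3 points:
print(growth_sem(3.0, 3.0))  # ~4.24, larger than either error alone
```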
Questions surrounding the
student’s achievement level
The more questions the merrier
What does it take to accurately
measure achievement?
Teachers encounter a distribution
of student performance
[Diagram: On the beginning-literacy-to-adult-reading scale, the students in a single 5th grade class scatter widely above and below grade level performance.]
Adaptive testing works differently
Item bank can
span full
range of
achievement
Items available need to match student
ability
California STAR NWEA MAP
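As a rough intuition for why a wide, adaptive item bank matters, here is a deliberately simplified caricature of adaptive item selection (a hypothetical sketch, not MAP’s actual item-selection or scoring algorithm): serve the unused item closest to the current estimate, then nudge the estimate after each response.

```python
def run_adaptive_test(responses, item_difficulties, start=200.0, step=5.0):
    """Toy adaptive test: pick the unused item nearest the current ability
    estimate, then move the estimate up or down based on the response."""
    estimate, used = start, set()
    for correct in responses:
        item = min((i for i in range(len(item_difficulties)) if i not in used),
                   key=lambda i: abs(item_difficulties[i] - estimate))
        used.add(item)
        estimate += step if correct else -step
    return estimate

pool = [170, 180, 190, 200, 210, 220, 230, 240]  # item difficulties on a RIT-like scale
print(run_adaptive_test([True, True, False, True], pool))  # 210.0
```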
These differences impact measurement error
[Chart: Test information versus scale score (160-240) for 5th grade level items, comparing a fully adaptive test with a constrained adaptive or paper/pencil test. Cut points are labeled Fail/Basic, Pass/Proficient (26th percentile), and Pass/Advanced (77th percentile); the two test designs show significantly different error.]
To determine growth, achievement
measurements must be related
through a scale
If I was measured as:
5’ 9”
And a year later I was:
1.82m
Did I grow?
Yes. ~ 2.5”
How do you know?
Let’s measure height again
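The two height readings only become comparable once both sit on a single scale, which is exactly the role a vertical scale plays for achievement scores. A minimal illustration (the conversion helper is just for this example):

```python
INCHES_PER_METER = 39.3701

def to_inches(feet=0, inches=0, meters=0.0):
    """Express a height, given in feet/inches or meters, in inches."""
    return feet * 12 + inches + meters * INCHES_PER_METER

time1 = to_inches(feet=5, inches=9)   # 69.0 inches
time2 = to_inches(meters=1.82)        # ~71.7 inches
print(round(time2 - time1, 1))        # ~2.7 inches, roughly the ~2.5" on the slide
```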
Traditional assessment uses items
reflecting the grade level standards
[Diagram: A scale from beginning literacy to adult reading showing 4th, 5th, and 6th grade level standards; the traditional assessment item bank covers only the grade level standards for a single grade.]
Traditional assessment uses items
reflecting the grade level standards
[Diagram: The same scale with grade level standards for 4th, 5th, and 6th grade; the overlap between adjacent grade bands allows linking and construction of a vertical scale.]
• Study on impact of assessment
selection on VAM results
–Defined a misidentified teacher as one whose apparent growth was incorrect by more than one-half of a year [1]
• Less than .5 years or more than 1.5 years
Error can change your life!!!
[1] Woodworth, J.L., Does Assessment Selection Matter When Computing Teacher Value-Added Measures?, http://www.kingsburycenter.org/sites/default/files/James%20Woodworth%20Data%20Award%20Research%20Brief.pdf
• “. . . in the 25 student (single class)
simulations. At the 25 student level, the VAM
based on the TAKS misidentifies 35% of all
teachers, whereas, the VAM based on the
MAP misidentifies only 1% of teachers.”
Initial measurement error is a significant issue in
AYP and Teacher Evaluation work
Error can change your life!!!
Black, P., & Wiliam, D. (2007). Large-scale assessment systems: Design principles drawn from international comparisons. Measurement: Interdisciplinary Research & Perspective, 5(1), 1-53.
• …when science is defined in terms of
knowledge of facts that are taught in
school…(then) those students who have been
taught the facts will know them, and those
who have not will…not. A test that assesses
these skills is likely to be highly sensitive to
instruction.
The instrument must be able to
detect instruction
Black, P., & Wiliam, D. (2007). Large-scale assessment systems: Design principles drawn from international comparisons. Measurement: Interdisciplinary Research & Perspective, 5(1), 1-53.
• When ability in science is defined in terms of
scientific reasoning…achievement will be less
closely tied to age and exposure, and more
closely related to general intelligence. In
other words, science reasoning tasks are
relatively insensitive to instruction.
The more complex, the harder to
detect and attribute to one teacher
• Tests specifically designed to inform classroom
instruction and school improvement in
formative ways
No incentive in the system for
inaccurate data
Using tests in high-stakes ways creates a new dynamic
New phenomenon when used as part of a compensation program
[Chart: Mean value-added growth by school, comparing students who took 10+ minutes longer in spring than in fall with all other students.]
Cheating
Atlanta Public Schools
Crescendo Charter Schools
Philadelphia Public Schools
Washington DC Public Schools
Houston Independent School
District
Michigan Public Schools
When teachers are evaluated
on growth using a once per
year assessment, one teacher
who cheats disadvantages the
next teacher
Other consequence
Other issues
Proctoring
Proctoring both with and without the
classroom teacher raises possible
problems
Documentation that test
administration procedures were
properly followed is important
Monitoring testing conditions assists
with reliability
Testing is complete . . .
What is useful to answer our question?
The Test
The Growth Metric
The Evaluation
The Rating
The problem with spring-spring
testing
[Timeline: A spring 2011 to spring 2012 testing window spans Teacher 1’s instruction, the summer break, and Teacher 2’s instruction.]
• When possible, use a spring-fall-spring approach
• Measure summer loss and incentivize schools and teachers to minimize it
• Measure teacher performance fall to spring, giving as much instructional time as possible between assessments
• Monitor testing conditions to minimize gaming of fall-spring results
A better approach
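A minimal sketch of the bookkeeping this implies, assuming each student has a prior-spring, fall, and current-spring score (illustrative numbers only):

```python
def split_growth(spring_prior, fall, spring_current):
    """Split a spring-to-spring window into the summer change and the
    fall-to-spring growth attributable to the current teacher."""
    return {
        "summer_change": fall - spring_prior,            # often negative (summer loss)
        "fall_to_spring_growth": spring_current - fall,  # current teacher's window
    }

print(split_growth(spring_prior=205, fall=202, spring_current=211))
# {'summer_change': -3, 'fall_to_spring_growth': 9}
```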
Without context what is
“Good”?
[Diagram: The NWEA RIT scale, from beginning reading to adult literacy, shown alongside national percentiles from the NWEA norms study, ACT college readiness benchmarks, a state test’s “Meets” proficiency performance level, and the Common Core proficient performance level.]
The metric matters -
Let’s go underneath “Proficiency”
[Chart: Difficulty of the Virginia SOL Pass/Proficient cut score in reading and mathematics, grades 2-8, expressed as a national percentile and compared with college readiness benchmarks.]
A study of the alignment of the NWEA RIT scale with the Virginia Standards of Learning (SOL), December 2012
Difficulty of ACT college readiness
standards
The metric matters -
Let’s go underneath “Proficiency”
Dahlin, M. and Durant, S., The State of Proficiency, Kingsbury Center at NWEA, July 2011
What gets measured and attended to really does matter
[Chart: One district’s change in 5th grade mathematics performance relative to the KY proficiency and college readiness cut scores – number of students by fall RIT score, coded as Up, Down, or No Change.]
Changing from Proficiency to Growth means all kids matter
[Chart: Number of 5th grade students in the same district meeting projected mathematics growth – number of students by fall score, split into below projected growth and met or above projected growth.]
How can we make it fair?
The Test
The Growth Metric
The Evaluation
The Rating
Context – 2011 NWEA Student Norms
[Example: A 4th grade student with a fall RIT score of 200 in reading has typical (norm) growth of about 7 RIT.]
FRL vs. non-FRL? IEP vs. non-IEP? ESL vs. non-ESL?
Outside of a teacher’s direct control
A Visual Representation of Value Added
Fall 4th Grade MAP Test: Student A, Fall RIT Score 200
Spring 4th Grade MAP Test: Student A, Spring RIT Score 209
Average Spring Score for Similar Students: RIT Score 207
Value Added: +2 RIT
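Stripped down, the slide’s arithmetic is observed minus expected. A minimal sketch using Student A’s numbers (the function name is illustrative only):

```python
def value_added(observed_spring, expected_spring_for_similar_students):
    """Value added for one student: observed spring score minus the average
    spring score of similar students who started in the same place."""
    return observed_spring - expected_spring_for_similar_students

# Student A: fall RIT 200, observed spring RIT 209; similar students
# average RIT 207 in spring.
print(value_added(209, 207))  # +2 RIT
```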
• What if I skip this step?
– Comparison is likely against normative data so the
comparison is to “typical kids in typical settings”
• How fair is it to disregard context?
– Good teacher – bad school
– Good teacher – challenging kids
Does your personal goal setting consider
context?
Consider . . .
• Lack of a historical context
– What has this teacher and these students done in
the past?
• Lack of comparison groups
– What have other teachers done in the past?
• What is the objective?
– Is the objective to meet a standard of
performance or demonstrate improvement?
• Do you set safe goals or stretch goals?
Challenges with goal setting
• Value-added models control for a variety of classroom, school-level, and other conditions
– Proven statistical methods
– All attempt to minimize error
– Variables outside the controls are assumed to be random
Value-added is science
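As a rough illustration only, and not any particular vendor’s model, a simple covariate-adjustment approach regresses spring scores on fall scores plus context variables and treats each teacher’s average residual as a crude value-added estimate (all data below are hypothetical):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level records: scores plus one context indicator.
df = pd.DataFrame({
    "spring":  [211, 206, 215, 199, 208, 203],
    "fall":    [200, 198, 207, 193, 201, 196],
    "frl":     [1, 1, 0, 1, 0, 1],              # free/reduced-price lunch flag
    "teacher": ["A", "A", "A", "B", "B", "B"],
})

# Expected spring score given fall score and context; residuals capture
# over- or under-performance relative to that expectation.
fit = smf.ols("spring ~ fall + frl", data=df).fit()
df["residual"] = fit.resid
print(df.groupby("teacher")["residual"].mean())  # crude per-teacher estimate
```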
• Control for measurement
error
– All models attempt to address
this issue
• Population size
• Multiple data points
– Error is compounded when combining two test events
– Nevertheless, many teachers’
value-added scores will fall
within the range of statistical
error
A variety of errors means more
stability only at the extremes
[Chart: Mathematics Growth Index Distribution by Teacher – Validity Filtered. Each teacher’s average growth index score and range is plotted, grouped into quintiles Q1-Q5.]
Each line in this display represents a single teacher. The graphic
shows the average growth index score for each teacher (green
line), plus or minus the standard error of the growth index estimate
(black line). We removed students who had tests of questionable
validity and teachers with fewer than 20 students.
Range of teacher value-added
estimates
With one teacher,
error means a lot
• Value-added models assume that variation is
caused by randomness if not controlled for
explicitly
– Young teachers are assigned disproportionate
numbers of students with poor discipline records
– Parent requests for the “best” teachers are
honored
• Sound educational reasons for placement are
likely to be defensible
Assumption of randomness can
have risk implications
“The findings indicate that these modeling
choices can significantly influence outcomes for
individual teachers, particularly those in the tails
of the performance distribution who are most
likely to be targeted by high-stakes policies.”
Ballou, D., Mokher, C. and Cavalluzzo, L. (2012) Using Value-Added Assessment for Personnel
Decisions: How Omitted Variables and Model Specification Influence Teachers’ Outcomes.
Instability at the tails of the
distribution
LA Times Teacher #1
LA Times Teacher #2
How tests are used to evaluate
teachers
The Test
The Growth Metric
The Evaluation
The Rating
• How would you
translate a rank order
to a rating?
• Data can be provided
• Value judgment
ultimately the basis
for setting cut scores
for points or rating
Translation into ratings can be
difficult to inform with data
• What is far below a
district’s expectation is
subjective
• What about
• Obligation to help
teachers improve?
• Quality of replacement
teachers?
Decisions are value-based, not empirical
• The system for combining elements and producing a rating is also a value-based decision
–Multiple measures and principal judgment
must be included
–Evaluate the extremes to make sure it
makes sense
Even multiple measures need to
be used well
[Chart: Evaluator ratings of teachers, distributed across Ineffective, Developing, Effective, and Highly Effective.]
Remember this?
5800 teachers evaluated between January and May 2012, The Atlanta Journal-Constitution January 7, 2013
Leadership Courage Is A Key
Ratings can be driven by the assessment
[Chart: Observation and assessment ratings (0-5) for Teacher 1, Teacher 2, and Teacher 3.]
Real or Noise?
If evaluators do not differentiate
their ratings,
then all differentiation comes from
the test
Big Message
1. Selection of an appropriate test:
• Used for the purpose for which it was designed
(proficiency vs. growth)
• Can accurately measure the test performance of all
students
2. Alignment between the content assessed and
the content to be taught
3. Need for context for growth/control for factors
outside a teacher’s direct control (value-added)
Please be thoughtful about . . .
• Presentations and other recommended
resources are available at:
– www.nwea.org
– www.kingsburycenter.org
– Slideshare.net
• Contacting us:
NWEA Main Number
503-624-1951
E-mail: andy.hegedus@nwea.org
More information

More Related Content

What's hot

ΟΔΥΣΣΕΙΑ Ζ : ΝΑΥΣΙΚΑ
ΟΔΥΣΣΕΙΑ Ζ : ΝΑΥΣΙΚΑΟΔΥΣΣΕΙΑ Ζ : ΝΑΥΣΙΚΑ
ΟΔΥΣΣΕΙΑ Ζ : ΝΑΥΣΙΚΑ
ΕΛΕΝΗ ΜΟΥΤΑΦΗ
 
ο γλάρος άντον τσέχωφ
ο γλάρος   άντον τσέχωφο γλάρος   άντον τσέχωφ
ο γλάρος άντον τσέχωφ
Anonimos Ellinas
 
ΠΕΛΙΑΣ ΚΑΙ ΙΑΣΟΝΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
ΠΕΛΙΑΣ ΚΑΙ ΙΑΣΟΝΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣΠΕΛΙΑΣ ΚΑΙ ΙΑΣΟΝΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
ΠΕΛΙΑΣ ΚΑΙ ΙΑΣΟΝΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣMaria Froudaraki
 
Εικονογραφικοί τύποι της Θεοτόκου
Εικονογραφικοί τύποι της ΘεοτόκουΕικονογραφικοί τύποι της Θεοτόκου
Εικονογραφικοί τύποι της Θεοτόκου
Δήμητρα Τζίνου
 
''το σταφύλι ''
''το σταφύλι ''''το σταφύλι ''
''το σταφύλι ''
kvaggelio25
 
Κλέφτικο τραγούδι: Του Βασίλη
Κλέφτικο τραγούδι: Του ΒασίληΚλέφτικο τραγούδι: Του Βασίλη
Κλέφτικο τραγούδι: Του Βασίλη
JoannaArtinou
 
Η ΑΝΑΛΗΨΗ ΤΟΥ ΧΡΙΣΤΟΥ-ΘΡΗΣΚΕΥΤΙΚΑ Γ ΤΑΞΗΣ
Η ΑΝΑΛΗΨΗ ΤΟΥ ΧΡΙΣΤΟΥ-ΘΡΗΣΚΕΥΤΙΚΑ Γ ΤΑΞΗΣΗ ΑΝΑΛΗΨΗ ΤΟΥ ΧΡΙΣΤΟΥ-ΘΡΗΣΚΕΥΤΙΚΑ Γ ΤΑΞΗΣ
Η ΑΝΑΛΗΨΗ ΤΟΥ ΧΡΙΣΤΟΥ-ΘΡΗΣΚΕΥΤΙΚΑ Γ ΤΑΞΗΣMaria Froudaraki
 
ΙΛΙΑΔΑ Τ - ΜΗΝΙΔΟΣ ΑΠΟΡΡΗΣΙΣ ( ΣΥΜΦΙΛΙΩΣΗ ΑΧΙΛΛΕΑ-ΑΓΑΜΕΜΝΟΝΑ)
ΙΛΙΑΔΑ Τ - ΜΗΝΙΔΟΣ ΑΠΟΡΡΗΣΙΣ ( ΣΥΜΦΙΛΙΩΣΗ ΑΧΙΛΛΕΑ-ΑΓΑΜΕΜΝΟΝΑ)ΙΛΙΑΔΑ Τ - ΜΗΝΙΔΟΣ ΑΠΟΡΡΗΣΙΣ ( ΣΥΜΦΙΛΙΩΣΗ ΑΧΙΛΛΕΑ-ΑΓΑΜΕΜΝΟΝΑ)
ΙΛΙΑΔΑ Τ - ΜΗΝΙΔΟΣ ΑΠΟΡΡΗΣΙΣ ( ΣΥΜΦΙΛΙΩΣΗ ΑΧΙΛΛΕΑ-ΑΓΑΜΕΜΝΟΝΑ)
ΕΛΕΝΗ ΜΟΥΤΑΦΗ
 
ΤΟ ΛΙΟΝΤΑΡΙ ΤΗΣ ΝΕΜΕΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
ΤΟ ΛΙΟΝΤΑΡΙ ΤΗΣ ΝΕΜΕΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣΤΟ ΛΙΟΝΤΑΡΙ ΤΗΣ ΝΕΜΕΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
ΤΟ ΛΙΟΝΤΑΡΙ ΤΗΣ ΝΕΜΕΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣMaria Froudaraki
 
Συστατική επιστολή
Συστατική επιστολήΣυστατική επιστολή
Συστατική επιστολήMiltiadis Petridis
 
ΦΥΛΛΟ ΕΡΓΑΣΙΑΣ, ΕΝΟΤΗΤΑ 2
ΦΥΛΛΟ ΕΡΓΑΣΙΑΣ, ΕΝΟΤΗΤΑ 2ΦΥΛΛΟ ΕΡΓΑΣΙΑΣ, ΕΝΟΤΗΤΑ 2
ΦΥΛΛΟ ΕΡΓΑΣΙΑΣ, ΕΝΟΤΗΤΑ 2
Alexandra Gerakini
 
Αβορίγινες
ΑβορίγινεςΑβορίγινες
Αβορίγινες
tapaidiatonkaision
 
Οδύσσεια, ραψωδία ε 50 251
Οδύσσεια, ραψωδία ε 50 251Οδύσσεια, ραψωδία ε 50 251
Οδύσσεια, ραψωδία ε 50 251
JoannaArtinou
 
ο κύκλος ζωής της πεταλούδας.
ο κύκλος ζωής της πεταλούδας.ο κύκλος ζωής της πεταλούδας.
ο κύκλος ζωής της πεταλούδας.
ΣΟΦΙΑ ΓΚΟΡΟΥ
 
η ζωή των κυκλαδιτών
η ζωή των κυκλαδιτώνη ζωή των κυκλαδιτών
η ζωή των κυκλαδιτώνdaskalogiannis
 
Επιστήμη
ΕπιστήμηΕπιστήμη
Επιστήμη
chavalesnick
 
Ο κύριος Τικ - Τακ και η στρατσαπαρισμένη νότα.
Ο κύριος Τικ - Τακ και η στρατσαπαρισμένη νότα.Ο κύριος Τικ - Τακ και η στρατσαπαρισμένη νότα.
Ο κύριος Τικ - Τακ και η στρατσαπαρισμένη νότα.
evamitro
 
Οδυσσέας Ελύτης
Οδυσσέας ΕλύτηςΟδυσσέας Ελύτης

What's hot (20)

ΟΔΥΣΣΕΙΑ Ζ : ΝΑΥΣΙΚΑ
ΟΔΥΣΣΕΙΑ Ζ : ΝΑΥΣΙΚΑΟΔΥΣΣΕΙΑ Ζ : ΝΑΥΣΙΚΑ
ΟΔΥΣΣΕΙΑ Ζ : ΝΑΥΣΙΚΑ
 
ο γλάρος άντον τσέχωφ
ο γλάρος   άντον τσέχωφο γλάρος   άντον τσέχωφ
ο γλάρος άντον τσέχωφ
 
ΠΕΛΙΑΣ ΚΑΙ ΙΑΣΟΝΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
ΠΕΛΙΑΣ ΚΑΙ ΙΑΣΟΝΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣΠΕΛΙΑΣ ΚΑΙ ΙΑΣΟΝΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
ΠΕΛΙΑΣ ΚΑΙ ΙΑΣΟΝΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
 
Αγγεία
ΑγγείαΑγγεία
Αγγεία
 
Εικονογραφικοί τύποι της Θεοτόκου
Εικονογραφικοί τύποι της ΘεοτόκουΕικονογραφικοί τύποι της Θεοτόκου
Εικονογραφικοί τύποι της Θεοτόκου
 
Βίκη Καχριμάνη Κουμανταρεας λογοτεχνια-β3
Βίκη Καχριμάνη Κουμανταρεας λογοτεχνια-β3Βίκη Καχριμάνη Κουμανταρεας λογοτεχνια-β3
Βίκη Καχριμάνη Κουμανταρεας λογοτεχνια-β3
 
''το σταφύλι ''
''το σταφύλι ''''το σταφύλι ''
''το σταφύλι ''
 
Κλέφτικο τραγούδι: Του Βασίλη
Κλέφτικο τραγούδι: Του ΒασίληΚλέφτικο τραγούδι: Του Βασίλη
Κλέφτικο τραγούδι: Του Βασίλη
 
Η ΑΝΑΛΗΨΗ ΤΟΥ ΧΡΙΣΤΟΥ-ΘΡΗΣΚΕΥΤΙΚΑ Γ ΤΑΞΗΣ
Η ΑΝΑΛΗΨΗ ΤΟΥ ΧΡΙΣΤΟΥ-ΘΡΗΣΚΕΥΤΙΚΑ Γ ΤΑΞΗΣΗ ΑΝΑΛΗΨΗ ΤΟΥ ΧΡΙΣΤΟΥ-ΘΡΗΣΚΕΥΤΙΚΑ Γ ΤΑΞΗΣ
Η ΑΝΑΛΗΨΗ ΤΟΥ ΧΡΙΣΤΟΥ-ΘΡΗΣΚΕΥΤΙΚΑ Γ ΤΑΞΗΣ
 
ΙΛΙΑΔΑ Τ - ΜΗΝΙΔΟΣ ΑΠΟΡΡΗΣΙΣ ( ΣΥΜΦΙΛΙΩΣΗ ΑΧΙΛΛΕΑ-ΑΓΑΜΕΜΝΟΝΑ)
ΙΛΙΑΔΑ Τ - ΜΗΝΙΔΟΣ ΑΠΟΡΡΗΣΙΣ ( ΣΥΜΦΙΛΙΩΣΗ ΑΧΙΛΛΕΑ-ΑΓΑΜΕΜΝΟΝΑ)ΙΛΙΑΔΑ Τ - ΜΗΝΙΔΟΣ ΑΠΟΡΡΗΣΙΣ ( ΣΥΜΦΙΛΙΩΣΗ ΑΧΙΛΛΕΑ-ΑΓΑΜΕΜΝΟΝΑ)
ΙΛΙΑΔΑ Τ - ΜΗΝΙΔΟΣ ΑΠΟΡΡΗΣΙΣ ( ΣΥΜΦΙΛΙΩΣΗ ΑΧΙΛΛΕΑ-ΑΓΑΜΕΜΝΟΝΑ)
 
ΤΟ ΛΙΟΝΤΑΡΙ ΤΗΣ ΝΕΜΕΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
ΤΟ ΛΙΟΝΤΑΡΙ ΤΗΣ ΝΕΜΕΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣΤΟ ΛΙΟΝΤΑΡΙ ΤΗΣ ΝΕΜΕΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
ΤΟ ΛΙΟΝΤΑΡΙ ΤΗΣ ΝΕΜΕΑΣ-ΙΣΤΟΡΙΑ Γ ΤΑΞΗΣ
 
Συστατική επιστολή
Συστατική επιστολήΣυστατική επιστολή
Συστατική επιστολή
 
ΦΥΛΛΟ ΕΡΓΑΣΙΑΣ, ΕΝΟΤΗΤΑ 2
ΦΥΛΛΟ ΕΡΓΑΣΙΑΣ, ΕΝΟΤΗΤΑ 2ΦΥΛΛΟ ΕΡΓΑΣΙΑΣ, ΕΝΟΤΗΤΑ 2
ΦΥΛΛΟ ΕΡΓΑΣΙΑΣ, ΕΝΟΤΗΤΑ 2
 
Αβορίγινες
ΑβορίγινεςΑβορίγινες
Αβορίγινες
 
Οδύσσεια, ραψωδία ε 50 251
Οδύσσεια, ραψωδία ε 50 251Οδύσσεια, ραψωδία ε 50 251
Οδύσσεια, ραψωδία ε 50 251
 
ο κύκλος ζωής της πεταλούδας.
ο κύκλος ζωής της πεταλούδας.ο κύκλος ζωής της πεταλούδας.
ο κύκλος ζωής της πεταλούδας.
 
η ζωή των κυκλαδιτών
η ζωή των κυκλαδιτώνη ζωή των κυκλαδιτών
η ζωή των κυκλαδιτών
 
Επιστήμη
ΕπιστήμηΕπιστήμη
Επιστήμη
 
Ο κύριος Τικ - Τακ και η στρατσαπαρισμένη νότα.
Ο κύριος Τικ - Τακ και η στρατσαπαρισμένη νότα.Ο κύριος Τικ - Τακ και η στρατσαπαρισμένη νότα.
Ο κύριος Τικ - Τακ και η στρατσαπαρισμένη νότα.
 
Οδυσσέας Ελύτης
Οδυσσέας ΕλύτηςΟδυσσέας Ελύτης
Οδυσσέας Ελύτης
 

Similar to NWEA Growth and Teacher evaluation VA 9-13

Using Assessment Data for Educator and Student Growth
Using Assessment Data for Educator and Student GrowthUsing Assessment Data for Educator and Student Growth
Using Assessment Data for Educator and Student Growth
NWEA
 
NYSCOSS Conference Superintendents Training on Assessment 9 14
NYSCOSS Conference Superintendents Training on Assessment 9 14NYSCOSS Conference Superintendents Training on Assessment 9 14
NYSCOSS Conference Superintendents Training on Assessment 9 14
NWEA
 
Teacher evaluation presentation3 mass
Teacher evaluation presentation3  massTeacher evaluation presentation3  mass
Teacher evaluation presentation3 mass
John Cronin
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growthJohn Cronin
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
John Cronin
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growthJohn Cronin
 
Data Summer
Data SummerData Summer
Data Summer
mark.richardson
 
Ed Reform Lecture - University of Arkansas
Ed Reform Lecture - University of ArkansasEd Reform Lecture - University of Arkansas
Ed Reform Lecture - University of ArkansasJohn Cronin
 
Using tests for teacher evaluation texas
Using tests for teacher evaluation texasUsing tests for teacher evaluation texas
Using tests for teacher evaluation texasNWEA
 
Colorado assessment summit_teacher_eval
Colorado assessment summit_teacher_evalColorado assessment summit_teacher_eval
Colorado assessment summit_teacher_eval
John Cronin
 
Teacher evaluation presentation oregon
Teacher evaluation presentation   oregonTeacher evaluation presentation   oregon
Teacher evaluation presentation oregon
John Cronin
 
Aeiou of k 3 literary checkpoints-3 3
Aeiou of k 3 literary checkpoints-3 3Aeiou of k 3 literary checkpoints-3 3
Aeiou of k 3 literary checkpoints-3 3
Keith Eades
 
RtIMTSS SPE 501-Spr
  RtIMTSS                                       SPE 501-Spr  RtIMTSS                                       SPE 501-Spr
RtIMTSS SPE 501-Spr
VannaJoy20
 
IASB Student Growth Presentation
IASB Student Growth PresentationIASB Student Growth Presentation
IASB Student Growth Presentation
Richard Voltz
 
Seven purposes presentation
Seven purposes presentationSeven purposes presentation
Seven purposes presentation
John Cronin
 
Taking control of the South Carolina Teacher Evaluation framework
Taking control of the South Carolina Teacher Evaluation frameworkTaking control of the South Carolina Teacher Evaluation framework
Taking control of the South Carolina Teacher Evaluation framework
NWEA
 
Teacher evaluation and goal setting connecticut
Teacher evaluation and goal setting   connecticutTeacher evaluation and goal setting   connecticut
Teacher evaluation and goal setting connecticut
John Cronin
 
National Superintendent's Dialogue
National Superintendent's DialogueNational Superintendent's Dialogue
National Superintendent's Dialogue
NWEA
 
Wsu Greg Lobdell September 2008 Data And Decision Making
Wsu Greg Lobdell September 2008 Data And Decision MakingWsu Greg Lobdell September 2008 Data And Decision Making
Wsu Greg Lobdell September 2008 Data And Decision MakingWSU Cougars
 

Similar to NWEA Growth and Teacher evaluation VA 9-13 (20)

Using Assessment Data for Educator and Student Growth
Using Assessment Data for Educator and Student GrowthUsing Assessment Data for Educator and Student Growth
Using Assessment Data for Educator and Student Growth
 
NYSCOSS Conference Superintendents Training on Assessment 9 14
NYSCOSS Conference Superintendents Training on Assessment 9 14NYSCOSS Conference Superintendents Training on Assessment 9 14
NYSCOSS Conference Superintendents Training on Assessment 9 14
 
Teacher evaluation presentation3 mass
Teacher evaluation presentation3  massTeacher evaluation presentation3  mass
Teacher evaluation presentation3 mass
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
 
Data Summer
Data SummerData Summer
Data Summer
 
Ed Reform Lecture - University of Arkansas
Ed Reform Lecture - University of ArkansasEd Reform Lecture - University of Arkansas
Ed Reform Lecture - University of Arkansas
 
Using tests for teacher evaluation texas
Using tests for teacher evaluation texasUsing tests for teacher evaluation texas
Using tests for teacher evaluation texas
 
Colorado assessment summit_teacher_eval
Colorado assessment summit_teacher_evalColorado assessment summit_teacher_eval
Colorado assessment summit_teacher_eval
 
Teacher evaluation presentation oregon
Teacher evaluation presentation   oregonTeacher evaluation presentation   oregon
Teacher evaluation presentation oregon
 
Aeiou of k 3 literary checkpoints-3 3
Aeiou of k 3 literary checkpoints-3 3Aeiou of k 3 literary checkpoints-3 3
Aeiou of k 3 literary checkpoints-3 3
 
Oerc june 2014 final ppt combined
Oerc june 2014 final ppt combinedOerc june 2014 final ppt combined
Oerc june 2014 final ppt combined
 
RtIMTSS SPE 501-Spr
  RtIMTSS                                       SPE 501-Spr  RtIMTSS                                       SPE 501-Spr
RtIMTSS SPE 501-Spr
 
IASB Student Growth Presentation
IASB Student Growth PresentationIASB Student Growth Presentation
IASB Student Growth Presentation
 
Seven purposes presentation
Seven purposes presentationSeven purposes presentation
Seven purposes presentation
 
Taking control of the South Carolina Teacher Evaluation framework
Taking control of the South Carolina Teacher Evaluation frameworkTaking control of the South Carolina Teacher Evaluation framework
Taking control of the South Carolina Teacher Evaluation framework
 
Teacher evaluation and goal setting connecticut
Teacher evaluation and goal setting   connecticutTeacher evaluation and goal setting   connecticut
Teacher evaluation and goal setting connecticut
 
National Superintendent's Dialogue
National Superintendent's DialogueNational Superintendent's Dialogue
National Superintendent's Dialogue
 
Wsu Greg Lobdell September 2008 Data And Decision Making
Wsu Greg Lobdell September 2008 Data And Decision MakingWsu Greg Lobdell September 2008 Data And Decision Making
Wsu Greg Lobdell September 2008 Data And Decision Making
 

More from NWEA

Teacher goal setting in texas
Teacher goal setting in texasTeacher goal setting in texas
Teacher goal setting in texas
NWEA
 
Maximizing student assessment systems cronin
Maximizing student assessment systems   croninMaximizing student assessment systems   cronin
Maximizing student assessment systems croninNWEA
 
ND Assessment Program Alignment
ND Assessment Program AlignmentND Assessment Program Alignment
ND Assessment Program Alignment
NWEA
 
Nd evaluations using growth data 4 13
Nd evaluations using growth data 4 13Nd evaluations using growth data 4 13
Nd evaluations using growth data 4 13
NWEA
 
Dylan Wiliam seminar for district leaders accelerate learning with formative...
Dylan Wiliam seminar for district leaders  accelerate learning with formative...Dylan Wiliam seminar for district leaders  accelerate learning with formative...
Dylan Wiliam seminar for district leaders accelerate learning with formative...
NWEA
 
SC Assessment Summit March 2013
SC Assessment Summit March 2013SC Assessment Summit March 2013
SC Assessment Summit March 2013
NWEA
 
Assessment Program Alignment: Making Essential Connections Between Assessment...
Assessment Program Alignment: Making Essential Connections Between Assessment...Assessment Program Alignment: Making Essential Connections Between Assessment...
Assessment Program Alignment: Making Essential Connections Between Assessment...
NWEA
 
Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an...
Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an...Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an...
Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an...
NWEA
 
KLT TLC Leader Materials Set Excerpt
KLT TLC Leader Materials Set ExcerptKLT TLC Leader Materials Set Excerpt
KLT TLC Leader Materials Set Excerpt
NWEA
 
What's New at NWEA: Children’s Progress Academic Assessment (CPAA)
What's New at NWEA: Children’s Progress Academic Assessment (CPAA)What's New at NWEA: Children’s Progress Academic Assessment (CPAA)
What's New at NWEA: Children’s Progress Academic Assessment (CPAA)
NWEA
 
Predicting Proficiency… How MAP Predicts State Test Performance
Predicting Proficiency… How MAP Predicts State Test PerformancePredicting Proficiency… How MAP Predicts State Test Performance
Predicting Proficiency… How MAP Predicts State Test Performance
NWEA
 
Connecting the Dots: CCSS, DI, NWEA, Help!
Connecting the Dots: CCSS, DI, NWEA, Help!Connecting the Dots: CCSS, DI, NWEA, Help!
Connecting the Dots: CCSS, DI, NWEA, Help!
NWEA
 
What's New at NWEA: Keeping Learning on Track
What's New at NWEA: Keeping Learning on TrackWhat's New at NWEA: Keeping Learning on Track
What's New at NWEA: Keeping Learning on Track
NWEA
 
What’s New at NWEA: Power of Teaching
What’s New at NWEA: Power of TeachingWhat’s New at NWEA: Power of Teaching
What’s New at NWEA: Power of Teaching
NWEA
 
What’s New at NWEA: Skills Pointer
What’s New at NWEA: Skills PointerWhat’s New at NWEA: Skills Pointer
What’s New at NWEA: Skills Pointer
NWEA
 
Finding Meaning in NWEA Data
Finding Meaning in NWEA DataFinding Meaning in NWEA Data
Finding Meaning in NWEA Data
NWEA
 
An Alternative Method to Rate Teacher Performance
An Alternative Method to Rate Teacher PerformanceAn Alternative Method to Rate Teacher Performance
An Alternative Method to Rate Teacher Performance
NWEA
 
Data Driven Learning and the iPad
Data Driven Learning and the iPadData Driven Learning and the iPad
Data Driven Learning and the iPad
NWEA
 
21st Century Teaching and Learning
21st Century Teaching and Learning21st Century Teaching and Learning
21st Century Teaching and Learning
NWEA
 
Grading and Reporting Student Learning
Grading and Reporting Student LearningGrading and Reporting Student Learning
Grading and Reporting Student Learning
NWEA
 

More from NWEA (20)

Teacher goal setting in texas
Teacher goal setting in texasTeacher goal setting in texas
Teacher goal setting in texas
 
Maximizing student assessment systems cronin
Maximizing student assessment systems   croninMaximizing student assessment systems   cronin
Maximizing student assessment systems cronin
 
ND Assessment Program Alignment
ND Assessment Program AlignmentND Assessment Program Alignment
ND Assessment Program Alignment
 
Nd evaluations using growth data 4 13
Nd evaluations using growth data 4 13Nd evaluations using growth data 4 13
Nd evaluations using growth data 4 13
 
Dylan Wiliam seminar for district leaders accelerate learning with formative...
Dylan Wiliam seminar for district leaders  accelerate learning with formative...Dylan Wiliam seminar for district leaders  accelerate learning with formative...
Dylan Wiliam seminar for district leaders accelerate learning with formative...
 
SC Assessment Summit March 2013
SC Assessment Summit March 2013SC Assessment Summit March 2013
SC Assessment Summit March 2013
 
Assessment Program Alignment: Making Essential Connections Between Assessment...
Assessment Program Alignment: Making Essential Connections Between Assessment...Assessment Program Alignment: Making Essential Connections Between Assessment...
Assessment Program Alignment: Making Essential Connections Between Assessment...
 
Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an...
Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an...Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an...
Predicting Student Performance on the MSP-HSPE: Understanding, Conducting, an...
 
KLT TLC Leader Materials Set Excerpt
KLT TLC Leader Materials Set ExcerptKLT TLC Leader Materials Set Excerpt
KLT TLC Leader Materials Set Excerpt
 
What's New at NWEA: Children’s Progress Academic Assessment (CPAA)
What's New at NWEA: Children’s Progress Academic Assessment (CPAA)What's New at NWEA: Children’s Progress Academic Assessment (CPAA)
What's New at NWEA: Children’s Progress Academic Assessment (CPAA)
 
Predicting Proficiency… How MAP Predicts State Test Performance
Predicting Proficiency… How MAP Predicts State Test PerformancePredicting Proficiency… How MAP Predicts State Test Performance
Predicting Proficiency… How MAP Predicts State Test Performance
 
Connecting the Dots: CCSS, DI, NWEA, Help!
Connecting the Dots: CCSS, DI, NWEA, Help!Connecting the Dots: CCSS, DI, NWEA, Help!
Connecting the Dots: CCSS, DI, NWEA, Help!
 
What's New at NWEA: Keeping Learning on Track
What's New at NWEA: Keeping Learning on TrackWhat's New at NWEA: Keeping Learning on Track
What's New at NWEA: Keeping Learning on Track
 
What’s New at NWEA: Power of Teaching
What’s New at NWEA: Power of TeachingWhat’s New at NWEA: Power of Teaching
What’s New at NWEA: Power of Teaching
 
What’s New at NWEA: Skills Pointer
What’s New at NWEA: Skills PointerWhat’s New at NWEA: Skills Pointer
What’s New at NWEA: Skills Pointer
 
Finding Meaning in NWEA Data
Finding Meaning in NWEA DataFinding Meaning in NWEA Data
Finding Meaning in NWEA Data
 
An Alternative Method to Rate Teacher Performance
An Alternative Method to Rate Teacher PerformanceAn Alternative Method to Rate Teacher Performance
An Alternative Method to Rate Teacher Performance
 
Data Driven Learning and the iPad
Data Driven Learning and the iPadData Driven Learning and the iPad
Data Driven Learning and the iPad
 
21st Century Teaching and Learning
21st Century Teaching and Learning21st Century Teaching and Learning
21st Century Teaching and Learning
 
Grading and Reporting Student Learning
Grading and Reporting Student LearningGrading and Reporting Student Learning
Grading and Reporting Student Learning
 

Recently uploaded

Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup   New Member Orientation and Q&A (May 2024).pdfWelcome to TechSoup   New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
TechSoup
 
Instructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptxInstructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptx
Jheel Barad
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
siemaillard
 
How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...
Jisc
 
"Protectable subject matters, Protection in biotechnology, Protection of othe...
"Protectable subject matters, Protection in biotechnology, Protection of othe..."Protectable subject matters, Protection in biotechnology, Protection of othe...
"Protectable subject matters, Protection in biotechnology, Protection of othe...
SACHIN R KONDAGURI
 
678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf
CarlosHernanMontoyab2
 
1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx
JosvitaDsouza2
 
The basics of sentences session 5pptx.pptx
The basics of sentences session 5pptx.pptxThe basics of sentences session 5pptx.pptx
The basics of sentences session 5pptx.pptx
heathfieldcps1
 
Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
beazzy04
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
GeoBlogs
 
Lapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdfLapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdf
Jean Carlos Nunes Paixão
 
Operation Blue Star - Saka Neela Tara
Operation Blue Star   -  Saka Neela TaraOperation Blue Star   -  Saka Neela Tara
Operation Blue Star - Saka Neela Tara
Balvir Singh
 
Language Across the Curriculm LAC B.Ed.
Language Across the  Curriculm LAC B.Ed.Language Across the  Curriculm LAC B.Ed.
Language Across the Curriculm LAC B.Ed.
Atul Kumar Singh
 
Unit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdfUnit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdf
Thiyagu K
 
Biological Screening of Herbal Drugs in detailed.
Biological Screening of Herbal Drugs in detailed.Biological Screening of Herbal Drugs in detailed.
Biological Screening of Herbal Drugs in detailed.
Ashokrao Mane college of Pharmacy Peth-Vadgaon
 
The Accursed House by Émile Gaboriau.pptx
The Accursed House by Émile Gaboriau.pptxThe Accursed House by Émile Gaboriau.pptx
The Accursed House by Émile Gaboriau.pptx
DhatriParmar
 
Supporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptxSupporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptx
Jisc
 
The approach at University of Liverpool.pptx
The approach at University of Liverpool.pptxThe approach at University of Liverpool.pptx
The approach at University of Liverpool.pptx
Jisc
 
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCECLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
BhavyaRajput3
 
The Roman Empire A Historical Colossus.pdf
The Roman Empire A Historical Colossus.pdfThe Roman Empire A Historical Colossus.pdf
The Roman Empire A Historical Colossus.pdf
kaushalkr1407
 

Recently uploaded (20)

Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup   New Member Orientation and Q&A (May 2024).pdfWelcome to TechSoup   New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
 
Instructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptxInstructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptx
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
 
How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...
 
"Protectable subject matters, Protection in biotechnology, Protection of othe...
"Protectable subject matters, Protection in biotechnology, Protection of othe..."Protectable subject matters, Protection in biotechnology, Protection of othe...
"Protectable subject matters, Protection in biotechnology, Protection of othe...
 
678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf
 
1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx
 
The basics of sentences session 5pptx.pptx
The basics of sentences session 5pptx.pptxThe basics of sentences session 5pptx.pptx
The basics of sentences session 5pptx.pptx
 
Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
 
Lapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdfLapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdf
 
Operation Blue Star - Saka Neela Tara
Operation Blue Star   -  Saka Neela TaraOperation Blue Star   -  Saka Neela Tara
Operation Blue Star - Saka Neela Tara
 
Language Across the Curriculm LAC B.Ed.
Language Across the  Curriculm LAC B.Ed.Language Across the  Curriculm LAC B.Ed.
Language Across the Curriculm LAC B.Ed.
 
Unit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdfUnit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdf
 
Biological Screening of Herbal Drugs in detailed.
Biological Screening of Herbal Drugs in detailed.Biological Screening of Herbal Drugs in detailed.
Biological Screening of Herbal Drugs in detailed.
 
The Accursed House by Émile Gaboriau.pptx
The Accursed House by Émile Gaboriau.pptxThe Accursed House by Émile Gaboriau.pptx
The Accursed House by Émile Gaboriau.pptx
 
Supporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptxSupporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptx
 
The approach at University of Liverpool.pptx
The approach at University of Liverpool.pptxThe approach at University of Liverpool.pptx
The approach at University of Liverpool.pptx
 
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCECLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
 
The Roman Empire A Historical Colossus.pdf
The Roman Empire A Historical Colossus.pdfThe Roman Empire A Historical Colossus.pdf
The Roman Empire A Historical Colossus.pdf
 

NWEA Growth and Teacher evaluation VA 9-13

  • 1. Andy Hegedus, Ed. D. Kingsbury Center at NWEA September 2013 Measuring student growth accurately – It makes a difference in your world!
  • 2. • Goal is to improve student achievement through improving work force performance over time – Just like any profession there is variability in the performance • Belief system driving policy – Rigorous performance evaluation process, and the rewards, support, or removal of teachers that comes with it, is a major lever Overview/Setting the stage
  • 3. Evaluator Rating Ineffective Developing Effective Highly Effective What is happening just can’t be right! 5800 teachers evaluated between January and May 2012, The Atlanta Journal-Constitution January 7, 2013 “Statistically, this flies in the face of our academic achievement levels. These numbers just doesn’t jibe with reality,” Millar said. “If the Georgia evaluation system is going to be based on these type of statistics, I wouldn’t see us going forward with it because, just statistically, it can’t be valid.
  • 4. Focus should likely be elsewhere (on the 99%) Executive Brief: Tracking Trends in Employee Turnover, Retrieved March 11, 2013, http://www.shrm.org/research/benchmarks/documents/trends%20in%20turnover_final.pdf Remaining Workforce: • Effectiveness of surrounding system • Powerful Professional Development • Performance management system explicitly designed to improve performance Voluntary Turnover: • Working conditions • Induction and support Involuntary Turnover: • Financial stability • Keep the best Remaining 85% Voluntary 9% Involuntary 6% 2011 Percentage
  • 5. • Increase your understanding about various urgent assessment related topics – Ask better questions – Useful for making all types of decisions with data • Follow along and ask questions at any time – Slideshare.net • Will pause during transitions for you to discuss “Ah-Ha’s” with a neighbor My Purpose
  • 6. 1. Selection of an appropriate test: • Used for the purpose for which it was designed (proficiency vs. growth) • Can accurately measure the test performance of all students 2. Alignment between the content assessed and the content to be taught 3. Adjust for context/control for factors outside a teacher’s direct control (value-added) Three primary conditions for using tests for teacher evaluation
  • 7. 1. Evaluation process that focuses on helping teachers improve 2. The principal or designated evaluator should control the evaluation 3. Tests should inform the process, not dictate or decide it 4. Multiple measures should be used over time What NWEA supports
  • 8. 1. Use of tests as part of a dialogue to help teachers set improvement goals 2. Use of tests as a “yellow light” to identify teachers who may be in need of additional support or assistance Two approaches we like
  • 9. • What we’ve known to be true is now being shown to be true – Using data thoughtfully improves student achievement – 12% mathematics, 13% reading • There are dangers present however – Unintended Consequences Go forth thoughtfully with care Slotnik, W. J. , Smith, M. D., It’s more than money, February 2013, retrieved from http://www.ctacusa.com/PDFs/MoreThanMoney-report.pdf
  • 10. “What gets measured (and attended to), gets done” Remember the old adage?
  • 11. • NCLB –Cast light on inequities –Improved performance of “Bubble Kids” –Narrowed taught curriculum An infamous example
  • 12. It’s what we do that counts A patient’s health doesn’t change because we know their blood pressure It’s our response that makes all the difference
  • 13. 1. Shifting towards tighter state level control – a shift of decision-making away from local control 2. Our nation moved from a model of education reform that focused on fixing schools to a model that is focused on fixing the teaching profession Policy shifts make today’s conversation inevitable
  • 14. Be considerate of the continuum of stakes involved Support Compensate Terminate Increasing levels of required rigor Increasingrisk
  • 15. The use of value-added data for high stakes personnel decisions does not yet have a strong, coherent, body of case law. Expect litigation if value-added results are the lynchpin evidence for a teacher-dismissal case until a body of case law is established. • Due Process • Disparate impact doctrine Potential Litigation Issues
  • 16. Baker B., Oluwole, J., Green, P. (2013). The legal consequences of mandating high stakes decisions based on low quality information: Teacher evaluation in the Race to the Top Era. Education Policy Analysis Archives. Vol 21. No 5. Suggested reading
  • 17. Is the progress produced by this teacher dramatically different than teaching peers who deliver instruction to comparable students in comparable situations? What question is being answered in support of using data in evaluating teachers?
  • 18. Marcus Normal Growth Needed Growth Marcus’ growth College readiness standard
  • 19. The Test The Growth Metric The Evaluation The Rating There are four key steps required to answer this question Top-Down Model
  • 20. Assessment 1 Goal Setting Assessment(s) Results and Analysis Evaluation (Rating) How does the other popular process work? Bottom-Up Model (Student Learning Objectives) Understanding all four of the top-down elements are needed here
  • 21. The Test The Growth Metric The Evaluation The Rating Let’s begin at the beginning
  • 22. 3rd Grade ELA Standards 3rd Grade ELA Teacher? 3rd Grade Social Studies Teacher? Elem. Art Teacher? What is measured should be aligned to what is to be taught 1. Answer questions to demonstrate understanding of text…. 2. Determine the main idea of a text…. 3. Determine the meaning of general academic and domain specific words… Would you use MAP in the evaluation of a…. ~30% of teachers teach in tested subjects and grades The Other 69 Percent: Fairly Rewarding the Performance of Teachers of Nontested Subjects and Grades, http://www.cecr.ed.gov/guides/other69Percent.pdf
  • 23. • Assessments should align with the teacher’s instructional responsibility – Specific advanced content • HS teachers teaching discipline specific content – Especially 11th and 12th grade • MS teachers teaching HS content to advanced students – Non-tested subjects • School-wide results are more likely “professional responsibility” rather than reflecting competence – HS teachers providing remedial services What is measured should be aligned to what is to be taught
  • 24. • Many assessments are not designed to measure growth • Others do not measure growth equally well for all students The purpose and design of the instrument is significant
  • 25. Both status and growth are important but growth leads Beginning Literacy Adult Reading 5th Grade x x Time 1 Time 2 Status Two assumptions: 1. Measurement accuracy, and 2. Vertical scale
  • 26. Accurately measuring growth depends on accurately measuring achievement
  • 27. How about measuring height? What if the pencil isn’t very level? What if we marked with sidewalk chalk?
  • 28. Measurement Accuracy A test for you Beginning Literacy Adult Reading 5th Grade x x Time 1 Time 2 Pop Quiz: What’s bigger? 1. Time 1 Error or Time 2 Error alone 2. Time 2 minus Time 1 Error (Growth)
  • 29. Questions surrounding the student’s achievement level The more questions the merrier What does it take to accurately measure achievement?
  • 30. Teachers encounter a distribution of student performance Beginning Literacy Adult Reading 5th Grade x x x x x x x x x x x x x x x Grade Level Performance
  • 31. Adaptive testing works differently Item bank can span full range of achievement
  • 32. Items available need to match student ability California STAR NWEA MAP
  • 33. 5th Grade Level Items These differences impact measurement error .00 .02 .04 .06 .08 .10 .12 Information 170 180 190 200 210 220 230 240 Scale Score Pass/ Proficient Fully Adaptive Test Significantly Different Error 26th Fail/Basic Pass/Advanced 77th 160 Constrained Adaptive or Paper/Pencil Test
  • 34. To determine growth, achievement measurements must be related through a scale
  • 35. If I was measured as: 5’ 9” And a year later I was: 1.82m Did I grow? Yes. ~ 2.5” How do you know? Let’s measure height again
  • 36. Traditional assessment uses items reflecting the grade level standards Beginning Literacy Adult Reading 4th Grade 5th Grade 6th Grade Grade Level Standards Traditional Assessment Item Bank
  • 37. Traditional assessment uses items reflecting the grade level standards Beginning Literacy Adult Reading 4th Grade 5th Grade 6th Grade Grade Level Standards Grade Level Standards Overlap allows linking and scale construction Grade Level Standards
  • 38. • Study on impact of assessment selection on VAM results –Defined a misidentified teacher as one who appeared to have growth which was incorrect by more than one-half a year1 • Less than .5 years or more than 1.5 years Error can change your life!!! 1Woodworth, J.L., Does Assessment Selection Matter When Computing Teacher Value-Added Measures?, http://www.kingsburycenter.org/sites/default/files/James%20Woodworth%20Data%20Award%20Research%20Brief.pdf
  • 39. • “. . . in the 25 student (single class) simulations. At the 25 student level, the VAM based on the TAKS misidentifies 35% of all teachers, whereas, the VAM based on the MAP misidentifies only 1% of teachers.” Initial measurement error is a significant issue in AYP and Teacher Evaluation work Error can change your life!!!
  • 40. Black, P. and Wiliam, D.(2007) 'Large-scale assessment systems: Design principles drawn from international comparisons', Measurement: Interdisciplinary Research & Perspective, 5: 1, 1 — 53 • …when science is defined in terms of knowledge of facts that are taught in school…(then) those students who have been taught the facts will know them, and those who have not will…not. A test that assesses these skills is likely to be highly sensitive to instruction. The instrument must be able to detect instruction
  • 41. Black, P. and Wiliam, D.(2007) 'Large-scale assessment systems: Design principles drawn from international comparisons', Measurement: Interdisciplinary Research & Perspective, 5: 1, 1 — 53 • When ability in science is defined in terms of scientific reasoning…achievement will be less closely tied to age and exposure, and more closely related to general intelligence. In other words, science reasoning tasks are relatively insensitive to instruction. The more complex, the harder to detect and attribute to one teacher
  • 42. • Tests specifically designed to inform classroom instruction and school improvement in formative ways No incentive in the system for inaccurate data Using tests in high stakes ways creates new dynamic
  • 43. -6.00 -4.00 -2.00 0.00 2.00 4.00 6.00 8.00 1 3 5 7 9 11 13 15 17 19 21 23 25 27 29 31 33 35 37 39 41 43 45 47 49 51 53 55 57 59 61 63 65 67 69 71 Students taking 10+ minutes longer spring than fall All other students New phenomenon when used as part of a compensation program Mean value-added growth by school
  • 44. Cheating Atlanta Public Schools Crescendo Charter Schools Philadelphia Public Schools Washington DC Public Schools Houston Independent School District Michigan Public Schools
  • 45. Other consequence: when teachers are evaluated on growth using a once-per-year assessment, one teacher who cheats disadvantages the next teacher.
  • 46. Other issues – proctoring. Proctoring both with and without the classroom teacher raises possible problems. Documentation that test administration procedures were properly followed is important. Monitoring testing conditions assists with reliability.
  • 47. Testing is complete . . . What is useful to answer our question? The Test → The Growth Metric → The Evaluation → The Rating
  • 48. The problem with spring–spring testing (timeline from 3/11 to 3/12: Teacher 1's instruction, the summer, and Teacher 2's instruction all fall within a single spring-to-spring testing window)
  • 49. A better approach: • When possible use a spring – fall – spring approach • Measure summer loss and incentivize schools and teachers to minimize it • Measure teacher performance fall to spring, giving as much instructional time as possible between assessments • Monitor testing conditions to minimize gaming of fall–spring results
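A minimal sketch (Python, with invented scores) of the arithmetic behind this design: spring–fall–spring testing separates summer change from the receiving teacher's fall-to-spring growth, which a spring-to-spring comparison lumps together.

    # Hypothetical RIT scores for one student; all values are invented.
    prior_spring = 205   # last test of the year with Teacher 1
    fall = 201           # first test of the new year with Teacher 2
    spring = 210         # last test of the new year with Teacher 2

    summer_change = fall - prior_spring       # -4: summer loss, not any one teacher's doing
    teacher_2_growth = spring - fall          # +9: growth during Teacher 2's instruction
    spring_to_spring = spring - prior_spring  # +5: what Teacher 2 would be credited with
                                              #     under a spring-spring design
    print(summer_change, teacher_2_growth, spring_to_spring)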
  • 50. Without context, what is “good”? (diagram: the NWEA RIT scale and national percentiles from the NWEA norms study, spanning Beginning Reading to Adult Literacy, aligned with ACT college readiness benchmarks, state test “Meets”/proficiency performance levels, and Common Core proficient performance levels)
  • 51. The metric matters – let's go underneath “proficiency” (chart: difficulty of the Virginia SOL Pass/Proficient cut score, expressed as a national percentile from 0 to 100, for reading and math in grades 2–8, shown against college readiness). Source: A study of the alignment of the NWEA RIT scale with the Virginia Standards of Learning (SOL), December 2012
  • 52. Difficulty of ACT college readiness standards
  • 53. The metric matters - Let’s go underneath “Proficiency” Dahlin, M. and Durant, S., The State of Proficiency, Kingsbury Center at NWEA, July 2011
  • 54. What gets measured and attended to really does matter (chart: number of students by fall RIT score in mathematics, coded Up, Down, or No Change – one district's change in 5th grade mathematics performance relative to the KY proficiency cut scores, with proficiency and college readiness marked)
  • 55. Changing from proficiency to growth means all kids matter (chart: number of 5th grade students meeting projected mathematics growth in the same district, by the student's fall score, split into below projected growth and met-or-above projected growth)
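A small sketch (Python; the students and the cut score are invented) of why the two metrics direct attention so differently: a proficiency metric only registers students who cross the cut, while a growth metric asks whether every student met a projection.

    # Each tuple: (fall_score, spring_score, projected_growth). All values invented.
    students = [(185, 196, 9), (192, 203, 9), (200, 205, 8), (214, 220, 7), (230, 235, 6)]
    PROFICIENT_CUT = 200   # hypothetical proficiency cut score

    crossed_cut = sum(1 for f, s, _ in students if f < PROFICIENT_CUT <= s)
    met_projection = sum(1 for f, s, g in students if (s - f) >= g)

    print(f"{crossed_cut} of {len(students)} crossed the proficiency cut")   # only the bubble kid
    print(f"{met_projection} of {len(students)} met projected growth")       # every student counted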
  • 56. How can we make it fair? The Test → The Growth Metric → The Evaluation → The Rating
  • 57. Context – 2011 NWEA Student Norms: for a 4th grader with a fall RIT score of 200 in reading, the norms project 7 RIT of growth. FRL vs. non-FRL? IEP vs. non-IEP? ESL vs. non-ESL? These are outside of a teacher's direct control.
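A toy lookup sketch (Python) of what the norms step does: given a student's subject, grade, and fall score, return the typical growth of similar students nationally. Only the reading, grade 4, fall-200 → 7 RIT entry comes from the slide; the other table values and the nearest-score lookup are invented for illustration.

    # Placeholder norms table: (subject, grade, fall_score) -> typical fall-to-spring growth (RIT).
    NORM_GROWTH = {
        ("Reading", 4, 190): 8,   # invented
        ("Reading", 4, 200): 7,   # from the slide
        ("Reading", 4, 210): 6,   # invented
    }

    def projected_growth(subject, grade, fall_score):
        # Real norms cover the whole score range; this toy version just uses
        # the nearest listed starting score.
        candidates = [k for k in NORM_GROWTH if k[0] == subject and k[1] == grade]
        nearest = min(candidates, key=lambda k: abs(k[2] - fall_score))
        return NORM_GROWTH[nearest]

    print(projected_growth("Reading", 4, 200))   # 7, the slide's example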
  • 58. A visual representation of value added (diagram: Student A scores 200 RIT on the fall 4th grade MAP test and 209 RIT on the spring test; the average spring score for similar students is 207; the +2 RIT difference is the value added)
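A minimal sketch (Python) of the arithmetic the diagram shows. The 207 "expected" score is taken from the slide; producing that expectation is the hard, model-dependent part the rest of this section discusses.

    fall_score = 200        # Student A, fall 4th grade MAP
    spring_score = 209      # Student A, spring 4th grade MAP
    expected_spring = 207   # average spring score for similar students (from the slide)

    value_added = spring_score - expected_spring   # +2 RIT relative to similar students
    raw_growth = spring_score - fall_score         # +9 RIT with no context at all
    print(value_added, raw_growth)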
  • 59. • What if I skip this step? – Comparison is likely against normative data so the comparison is to “typical kids in typical settings” • How fair is it to disregard context? – Good teacher – bad school – Good teacher – challenging kids Does your personal goal setting consider context? Consider . . .
  • 60. • Lack of a historical context – What has this teacher and these students done in the past? • Lack of comparison groups – What have other teachers done in the past? • What is the objective? – Is the objective to meet a standard of performance or demonstrate improvement? • Do you set safe goals or stretch goals? Challenges with goal setting
  • 61. • Value added models control for a variety of classroom, school level, and other conditions – Proven statistical methods – All attempt to minimize error – Variables outside the controls are assumed to be random Value-added is science
  • 62. • Control for measurement error – All models attempt to address this issue • Population size • Multiple data points – Error is compounded when combining two test events – Nevertheless, many teachers’ value-added scores will fall within the range of statistical error A variety of errors means more stability only at the extremes
  • 63. Range of teacher value-added estimates (chart: mathematics growth index distribution by teacher, validity filtered, grouped into quintiles Q1–Q5 on a growth index scale from −12 to +12). Each line in this display represents a single teacher. The graphic shows the average growth index score for each teacher (green line), plus or minus the standard error of the growth index estimate (black line). We removed students who had tests of questionable validity and teachers with fewer than 20 students.
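A simplified sketch (Python; the student values are invented) of what the error bars represent: a teacher's mean growth index, its standard error, and whether a rough ±2 SE band around the mean excludes zero. Real value-added models estimate this error far more carefully; this only illustrates why many mid-distribution teachers fall within the band.

    import math

    def growth_index_summary(growth_indexes, min_students=20):
        # growth_indexes: one (observed - projected) growth value per student.
        n = len(growth_indexes)
        if n < min_students:                 # mirror the slide's 20-student filter
            return None
        mean = sum(growth_indexes) / n
        var = sum((g - mean) ** 2 for g in growth_indexes) / (n - 1)
        se = math.sqrt(var / n)              # standard error of the teacher's mean
        distinct_from_zero = abs(mean) > 2 * se
        return mean, se, distinct_from_zero

    # Invented growth-index values for one teacher's 20 students;
    # for this class the +/- 2 SE band ends up including zero.
    print(growth_index_summary([1.2, -0.5, 0.3, 0.4, -1.1, 0.9, -1.3, -0.2, 0.6, 1.4,
                                -0.8, -1.0, 0.1, 1.1, -0.4, 0.7, 1.5, -0.9, 0.3, 0.8]))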
  • 64. With one teacher, error means a lot
  • 65. • Value-added models assume that variation is caused by randomness if not controlled for explicitly – Young teachers are assigned disproportionate numbers of students with poor discipline records – Parent requests for the “best” teachers are honored • Sound educational reasons for placement are likely to be defensible Assumption of randomness can have risk implications
  • 66. Instability at the tails of the distribution: “The findings indicate that these modeling choices can significantly influence outcomes for individual teachers, particularly those in the tails of the performance distribution who are most likely to be targeted by high-stakes policies.” Ballou, D., Mokher, C. and Cavalluzzo, L. (2012) Using Value-Added Assessment for Personnel Decisions: How Omitted Variables and Model Specification Influence Teachers’ Outcomes. (chart: value-added estimates for LA Times Teacher #1 and LA Times Teacher #2)
  • 67. How tests are used to evaluate teachers: The Test → The Growth Metric → The Evaluation → The Rating
  • 68. • How would you translate a rank order to a rating? • Data can be provided • A value judgment is ultimately the basis for setting cut scores for points or a rating. Translation into ratings can be difficult to inform with data
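A toy sketch (Python) of why this step is value-laden: the data can supply a growth index, but someone must still choose where the cut points sit. The thresholds below are invented, not any district's policy; the labels are simply the four rating categories used earlier in the deck.

    # Hypothetical cut points on a teacher-level growth index; choosing them is a value judgment.
    RATING_CUTS = [
        (-1.0, "Ineffective"),    # growth index below -1.0
        (0.0,  "Developing"),     # -1.0 up to 0.0
        (1.0,  "Effective"),      # 0.0 up to 1.0
    ]

    def rating(growth_index):
        for upper_bound, label in RATING_CUTS:
            if growth_index < upper_bound:
                return label
        return "Highly Effective"             # 1.0 and above

    print(rating(-1.4), rating(0.3), rating(2.1))   # Ineffective Effective Highly Effective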
  • 69. • What counts as “far below a district’s expectation” is subjective • What about • The obligation to help teachers improve? • The quality of replacement teachers? Decisions are value-based, not empirical
  • 70. • The system for combining elements and producing a rating is also a value-based decision – Multiple measures and principal judgment must be included – Evaluate the extremes to make sure it makes sense Even multiple measures need to be used well
  • 71. Remember this? (chart: evaluator ratings – Ineffective, Developing, Effective, Highly Effective – for 5,800 teachers evaluated between January and May 2012; The Atlanta Journal-Constitution, January 7, 2013)
  • 72. Leadership courage is a key (chart: observation and assessment ratings, on a 0–5 scale, for Teacher 1, Teacher 2, and Teacher 3). Ratings can be driven by the assessment. Real or noise?
  • 73. Big message: if evaluators do not differentiate their ratings, then all differentiation comes from the test.
  • 74. 1. Selection of an appropriate test: • Used for the purpose for which it was designed (proficiency vs. growth) • Can accurately measure the test performance of all students 2. Alignment between the content assessed and the content to be taught 3. Need for context for growth/control for factors outside a teacher’s direct control (value-added) Please be thoughtful about . . .
  • 75. • Presentations and other recommended resources are available at: – www.nwea.org – www.kingsburycenter.org – Slideshare.net • Contacting us: NWEA Main Number 503-624-1951 E-mail: andy.hegedus@nwea.org More information

Editor's Notes

  1. Concept – If we fix schools we fix education. Schools actually did improve during this period. Race to the Top, Gates Foundation, Teach for America… Signaled in a number of ways. NCLB about fixing schools – 100% Proficient by 2014. Punishments for AYP – SES, Choice, Restructuring. Obama switch – Race to the Top. Fixing or improving teaching and the teaching profession. Recruiting teachers from alternative careers. Move from holding schools accountable to holding teachers accountable. Wrong? No. Different? Yes. David Brooks – Aug 2010 – Atlantic Monthly – Teachers are fair game – teachers under scrutiny – somewhat unfairly. BOE are asking about test-based accountability. Charleston SC – any teacher without 50% of students on growth norm – Yr 1 on report, Yr 2 only rehired by approval by BOE. 50% Yr 1, 25% Yr 2 to be rehired. Our goal – make sure you are prepared. Understand the risk. Proper ways to implement including legal issues. Clarify some of the implications – very complex – prepare you and a prudent course.
  2. Teacher evaluations and the use of data in them can take many forms. You can use them for supporting teachers and their improvement. You can use the evaluations to compensate teachers or groups of teachers differently or you can use them in their highest stakes way to terminate teachers. The higher the stakes put on the evaluation, the more risk there is to you and your organization from a political, legal, and equity perspective. Most people naturally respond with increasing the levels of rigor put into designing the process as a way to ameliorate the risk. One fact is that the risk can’t be eliminated. Our goal – Make sure you are prepared. Understand the risk. Proper ways to implement including legal issues. Clarify some of the implications – Very complex – Prepare you and a prudent course
  3. This is the value added metric. Not easy to make nuanced decisions. Can learn about the ends.
  4. Contrast with what value added communicates. Plot normal growth for Marcus vs anticipated growth – value added. If you ask whether the teachers provided value added, the answer is yes. Other line is what is needed for college readiness. Blue line is what is used to evaluate the teacher. Is he on the line the parents want him to be on? Probably not. Don’t focus on one at the expense of the other. NCLB – AYP vs what the parent really wants for goal setting. Can become so focused on measuring teachers that we lose sight of what parents value. We are better off moving towards the kids’ aspirations. As a parent I didn’t care if the school made AYP. I cared if my kids got the courses that helped them go where they want to go.
  5. Steps are quite important. People tend to skip some of these. Kids take a test – important that the test is aligned to instruction being given. Metric – look at growth vs growth norm and calculate a growth index. Two benefits – very transparent/simple. People tend to use our growth norms – if you hit 60% for a grade level within a school you are doing well. Norms – growth of a kid or group of kids compared to a nationally representative sample of students. Why isn’t this value added? Not all teachers can be compared to a nationally representative sample because they don’t teach kids that are just like the national sample. The third step controls for variables unique to the teacher’s classroom or environment. Fourth step – rating – how much below average before the district takes action, or how much above before someone gets performance pay. Particular challenge in NY state right now. Law requires it.
  6. (Same speaker note as #5: the four steps – the test, the growth metric, the evaluation, the rating – each matter, and people tend to skip some.)
  7. Common core – very ambitious things they want to measure – tackle things on an AP test. Write and show their work. A CC assessment to evaluate teachers can be a problem. Raise your hand if you know what the capital of Chile is. Santiago. Repeat after me. We will review in a couple of minutes. Facts can be relatively easily acquired and are instructionally sensitive. If you expose kids to facts in meaningful and engaging ways, it is sensitive to instruction.
  8. State assessment designed to measure proficiency – many items in the middle, not at the ends. Must use multiple points of data over time to measure this. We also believe that a principal should be more in control of the evaluation than the test – principals and teacher leaders are what change schools.
  9. 5th grade NY reading cut scores shown
  10. Problem – insensitive to instruction. Prereq skills – writing skills. Given events in N. Africa today, the question requires a lot of pre-req knowledge. Need to know the story. Put it into writing. Reasoning skills to put it together with events today. And I need to know what is going on today as well. One doesn’t develop this entire set of skills in the 9 months of instruction. Common core is what we want. Just not for teacher evaluation. These questions are not that sensitive to instruction. Problematic when we hold teachers accountable for instruction or growth.
  11. (Same speaker note as #10: complex reasoning tasks are relatively insensitive to instruction, which is problematic when teachers are held accountable for growth.)
  12. (Same speaker note as #5: the four steps – the test, the growth metric, the evaluation, the rating.)
  13. NCLB required everyone to get above proficient – message: focus on kids at or near proficient. School systems responded. MS standards are harder than the elem standards – MS problem. No effort to calibrate them – no effort to project elem to MS standards. Start easy and ramp up. Proficient in elem and not in MS with normal growth. When you control for the difficulty in the standards, elem and MS performance are the same.
  14. Not only are standards different across grades, they are different across states. It’s data like this that helps to inspire the Common Core and consistent standards, so we compare apples to apples.
  15. Dramatic differences between standards-based vs growth. KY 5th grade mathematics. Sample of students from a large school system. X-axis: fall score; Y: number of kids. Blue are the kids who did not change status between the fall and the spring on the state test. Red are the kids who declined in performance over the spring – descenders. Green are kids who moved above it in performance over the spring – ascenders – bubble kids. About 10% based on the total number of kids. Accountability plans are typically made based on these red and green kids.
  16. Same district as before. Yellow – did not meet target growth – spread over the entire range of kids. Green – did meet growth targets. 60% vs 40% is doing well – this is a high-performing district with high growth. Must attend to all kids – this is a good thing – ones in the middle and at both extremes. Old one was discriminatory – focus on some in lieu of others. Teachers who teach really hard at the standard for years – teachers need to be able to reach them all. This does a lot to move the accountability system to parents and our desires.
  17. (Same speaker note as #5: the four steps – the test, the growth metric, the evaluation, the rating.)
  18. Close by noting that NWEA recognized the need for this level of precision when trying to understand student performance (and by extension, teacher performance). This is why, in NY (where we first began having this conversation with partners), we sought to partner with VARC, because of their background and experience providing these services (and because this is something that we did not want to do, even if we had the background/experience). Talk about the number of districts and students in 11-12 and 12-13, to provide context for the ability for this to be done on a broad scale.
  19. There are wonderful teachers who teach in very challenging, dysfunctional settings. The setting can impact the growth. HLM embeds the student in a classroom, the classroom in the school, and controls for the school parameters. Is it perfect? No. Is it better? Yes. The opposite is true and learning can be magnified as well. What if kids are a challenge – ESL or attendance, for instance? It can deflate scores, especially with a low number of kids in the sample being analyzed. Also need to make sure you have a large enough ‘n’ to make this possible – especially true in small districts. Our position is that a test can inform the decision, but the principal/administrator should collect the bulk of the data that is used in the performance evaluation process.
  20. Experts recommend multiple years of data to do the evaluation. Invalid to just use two points, and will testify to it. Principals never fire anyone – NY rubber room – myth. If they do, it’s not fast enough – need to speed up the process. This won’t make the process faster – principals doing intense evaluations will.
  21. Measurement error is compounded in test 1 and test 2
  22. Green line is their VA estimate and the bar is the error of measure. Both on top and bottom, people can be in other quintiles. People in the middle can cross quintiles – just based on SEM. Cross country – winners spread out, end of the race spread out; in the middle you get a pack, and moving up in the middle makes a big difference in the overall race. Instability and narrowness of ranges means that, for teachers in the middle of the distribution, slight changes in performance can be a large change in performance ranking.
  23. Non-random assignments. Models control for various things – FRL, ethnicity, school effectiveness overall. Beyond this point, assignment is assumed to be random. 1st year teachers get more discipline problems than teachers who have been teaching 30 years, who pick the kids they get. If the model doesn’t control for disciplinary record – none do, since they don’t have that data – scores are inflated. Makes the model invalid. Principals do need to do non-random assignment – sound educational reasons for the placement – match adults to kids.
  24. (Same speaker note as #5: the four steps – the test, the growth metric, the evaluation, the rating.)
  25. Use NY point system as the example