Data Quality Challenges When Implementing Classroom Value-Added Models

Chris Thorn
Sara Kraemer
Jeff Watson

Center for Data Quality and Systems Innovation
Value-Added Research Center
UW Madison
Outline
• Overview of VARC/CDQSI
• Overview of Value-Added methods
• Student-teacher linkage
• Threats to validity for classroom Value-Added
    – Student-teacher linkage data quality
    – Assignment and matching of students and teachers
VARC and CDQSI Overview
• Value-Added Research Center
   Director – Rob Meyer
• Center for Data Quality and Systems Innovation
   Director – Chris Thorn
• Areas of work
   –   Data systems / Data quality / Decision support
   –   Professional development
   –   Reporting systems
   –   Evaluation
   –   Mixed methods
• Places where we work
Districts and States working with VARC

[Map of partner sites]
Minneapolis, Milwaukee, Madison, Racine, Chicago, New York City, Los Angeles, Tulsa, Atlanta, Hillsborough County, Broward County, US Dept. of Ed.
What is Value-Added Anyway?

How do we measure student performance?
• How do we do this?
    (example: the current NCLB method… percent proficient)
    (example: Colorado Growth Model)
    (example: VARC’s Value-Added Analysis)
• The following non-education example tries to illustrate the difference between these measures.
The Oak Tree Analogy
Explaining the concept of Value-Added by evaluating the performance of two gardeners

• For the past year, these gardeners have been tending to their oak trees, trying to maximize the height of the trees.
• Each gardener used a variety of strategies to help their own tree grow… which of these two gardeners was more successful with their strategies?
To measure the performance of the gardeners, we will measure the height of the trees today (1 year after they began tending to the trees).

• Using this method, Gardener B is the superior gardener.

             This method is analogous to using an Attainment Model.
… but this attainment result does not tell the whole story.

• These trees are 4 years old.
• We need to find the starting height for each tree in order to more fairly evaluate each gardener’s performance during the past year.
• The trees were much shorter last year.

[Figure: Oak A and Oak B at age 3 (1 year ago) and age 4 (today)]
We can compare the height of the trees one year ago to the height today.

• By finding the difference between these heights, we can determine how many inches the trees grew during the year of the gardeners’ care.
• Oak B had more growth this year, so Gardener B is the superior gardener.

          This is analogous to a Simple Growth Model, also called Gain.
… but this simple growth result does not tell the whole story either.

• We do not yet know how much of this growth was due to the strategies used by the gardeners themselves.
• This is an “apples to oranges” comparison.
• For our oak tree example, three environmental factors we will examine are: Rainfall, Soil Richness, and Temperature.
External condition   Oak Tree A   Oak Tree B
Rainfall amount        High          Low
 Soil richness         Low           High
  Temperature          High          Low
How much the gardeners’ own strategies contributed to the growth of the trees…

• We can take out each environmental factor’s contribution to growth.
• After these external factors are accounted for, we will be left with the effect of just the gardeners.
• To find the correct adjustments, we will analyze data from all oaks in the region.
In order to find the impact of rainfall, soil richness, and temperature, we will plot the
growth of each individual oak in the region compared to its environmental conditions.
Now that we have identified growth trends for each of these environmental factors, we need to convert them into a form usable for our calculations.

Rainfall                                     Low      Medium     High
Growth in inches relative to the average      -5        -2        +3

Soil Richness                                Low      Medium     High
Growth in inches relative to the average      -3        -1        +2

Temperature                                  Low      Medium     High
Growth in inches relative to the average      +5        -3        -8

Now we can go back to Oak A and Oak B to adjust for their growing conditions.
To calculate our new adjusted growth, we start with simple growth.

• Next, we will use our numerical adjustments to account for the effect of each tree’s environmental conditions.
• When we are done, we will have an “apples to apples” comparison of the gardeners’ influence on growth.

    Oak A: +14 Simple Growth                        Oak B: +20 Simple Growth
Based on data for all oak trees in the region, we found that high rainfall resulted in 3 inches of extra growth on average.

For having high rainfall, Oak A’s growth is adjusted by -3 to compensate.
Similarly, for having low rainfall, Oak B’s growth is adjusted by +5 to compensate.

    Oak A: +14 Simple, -3 for Rainfall              Oak B: +20 Simple, +5 for Rainfall
For having poor soil, Oak A’s growth is adjusted by +3 to compensate.
For having rich soil, Oak B’s growth is adjusted by -2 to compensate.

    Oak A: +14 Simple, -3 for Rainfall, +3 for Soil
    Oak B: +20 Simple, +5 for Rainfall, -2 for Soil
For having high temperature, Oak A’s growth is adjusted by +8 to compensate.
For having low temperature, Oak B’s growth is adjusted by -5 to compensate.

    Oak A: +14 Simple, -3 for Rainfall, +3 for Soil, +8 for Temp
    Oak B: +20 Simple, +5 for Rainfall, -2 for Soil, -5 for Temp
Now that we have removed the effect of environmental conditions, our adjusted growth result puts the gardeners on a level playing field.

    We calculate that Gardener A’s effect on Oak A is +22 inches.
    We calculate that Gardener B’s effect on Oak B is +18 inches.

    Oak A: +14 Simple, -3 for Rainfall, +3 for Soil, +8 for Temp = +22 inches Adjusted Growth
    Oak B: +20 Simple, +5 for Rainfall, -2 for Soil, -5 for Temp = +18 inches Adjusted Growth
Using this method, Gardener A is the superior gardener.

By accounting for last year’s height and the environmental conditions of the trees during this year, we found the “value” each gardener “added” to the growth of the tree.

                     This is analogous to a Value-Added Model.

    Oak A: +14 Simple, -3 for Rainfall, +3 for Soil, +8 for Temp = +22 inches Adjusted Growth
    Oak B: +20 Simple, +5 for Rainfall, -2 for Soil, -5 for Temp = +18 inches Adjusted Growth
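The arithmetic behind the analogy is simple enough to write down. Below is a minimal, illustrative Python sketch that reproduces the numbers above; the adjustment values come from the regional growth tables, and all names are hypothetical rather than anything from VARC’s actual tooling.

```python
# Illustrative sketch of the Oak Tree Analogy arithmetic only; names are
# hypothetical and the adjustment values come from the tables above.
ADJUSTMENTS = {
    "rainfall":    {"Low": -5, "Medium": -2, "High": +3},
    "soil":        {"Low": -3, "Medium": -1, "High": +2},
    "temperature": {"Low": +5, "Medium": -3, "High": -8},
}

def adjusted_growth(simple_growth, conditions):
    """Remove each environmental factor's average contribution, leaving the
    growth attributable to the gardener."""
    result = simple_growth
    for factor, level in conditions.items():
        result -= ADJUSTMENTS[factor][level]
    return result

oak_a = adjusted_growth(14, {"rainfall": "High", "soil": "Low", "temperature": "High"})
oak_b = adjusted_growth(20, {"rainfall": "Low", "soil": "High", "temperature": "Low"})
print(oak_a, oak_b)  # 22 18 -> Gardener A added more value
```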
How does this analogy relate to Value-Added calculations in the education context?

                             Oak Tree Analogy        Value-Added in Education
Level of analysis?           • Gardeners             • IHE
                                                     • Districts/Schools
                                                     • Grades
                                                     • Classrooms
                                                     • Programs and Interventions

What are we using to         • Growth in Inches      • Relative Growth in Scale Score Points
measure success?

Sample                       • Single Oak Tree       • Groups of Students

Control Factors              • Rainfall              • Students’ Prior Performance
                             • Soil Richness           (most significant predictor)
                             • Temperature
                                                     Potential other variables collected for ALL students:
                                                     • Free/Reduced Lunch Status
                                                     • English Language Learner Status
                                                     • IEP / Special Education Status
                                                     • Race
                                                     • Gender
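In the education setting, the “remove the environment” step is done with a statistical model rather than a lookup table: current scores are regressed on prior scores and student characteristics, and classroom-level differences in what remains become the value-added estimates. The sketch below is a deliberately simplified illustration of that idea, not VARC’s production model (which involves shrinkage, measurement-error corrections, and more); all variable and function names are hypothetical.

```python
import numpy as np

def classroom_value_added(current, prior, frl, ell, classroom):
    """Toy value-added estimate: mean regression residual per classroom.
    current, prior: post and prior test scores (arrays); frl, ell: 0/1
    indicator arrays; classroom: array of classroom labels. Illustrative only."""
    X = np.column_stack([np.ones_like(prior), prior, frl, ell])  # intercept + controls
    beta, *_ = np.linalg.lstsq(X, current, rcond=None)           # ordinary least squares
    residual = current - X @ beta                                # growth unexplained by controls
    return {c: float(residual[classroom == c].mean())
            for c in np.unique(classroom)}
```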
Where is Classroom-Level Value-Added Used?
• Recognizing Effective Practice
• Providing Support
• Performance Pay
• Tenure Decisions
• Higher Education Evaluation of Teacher
  Preparation
• Performance Management
Overview of Data Requirements for Classroom VAA
•   Student demographics
•   Teacher roster with background and accreditation
•   School list and information
•   Student and teacher attendance
•   Course enrollment with assigned teachers
•   Achievement data
•   Mobility data
Simple is good…

                   …unless it’s wrong.
Defining Student-Teacher Linkages: Who is Teaching Who?

[Diagram: Student and Teacher linked through School and Course]

Student-side outcomes: formative assessment, achievement, graduation, behavior, post-graduation success
Teacher-side practice: pedagogy, curriculum, tutoring, coaching, PD, scheduling, class size
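One way to picture the linkage data this slide describes is as one record per student, teacher, and course section. The sketch below is a hypothetical data structure for such a record; the field names are illustrative and do not come from any particular district’s SIS.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class StudentTeacherLink:
    """Hypothetical student-teacher linkage record; field names are illustrative."""
    student_id: str
    teacher_id: str
    school_id: str
    course_id: str
    start_date: date                     # when the student joined this section
    end_date: Optional[date] = None      # None while the enrollment is active
    team_taught: bool = False            # rarely captured well in SIS data
    instruction_share: float = 1.0       # fraction of instruction attributed to this teacher
```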
Warning: Reality Approaching

[Diagram: the simple Student – School – Course – Teacher linkage]

… movement between schools

[Diagram: a student moving between schools, each with its own teachers]

… and movement within schools

[Diagram: a student linked to a teacher through several courses within one school]
… and complicated workflows

• School enrollment is estimated
• Staffing needs are estimated
• A student’s schedule may change based on completion
• Course assignments are determined sometime between Feb. and July
• Teachers may not be hired until September
… and absence rates

• Student side: attendance rate, discipline infractions
• Teacher side: extended leave, moved to cover absences?
… instructional strategies like grouping, pull-outs, room aides, team teaching

• Instructional support staff
• Specialists
… non-traditional schools

• Non-district-employed teachers
• Unique instructional approaches
… data errors, integration issues, SIS work-arounds, multiple districts

• Matching records across systems (SIS, HR)
• Data entry problems
• Creative scheduling
Ouch

[Diagram combining all of the above: course scheduling, enrollment, student and school mobility, matching records across SIS and HR systems, attendance and discipline, extended leave and coverage, instructional support staff and specialists, data entry problems, creative scheduling, the hiring workflow, school staffing adjustments, and unique instructional approaches]
Challenges for Classroom-Level Value-Added
• Threats to Validity
     – Student teacher linkages
     – Assignment of students and teachers
•   Mobility
•   Assessment Quality
•   Assessment Coverage
•   Integration
ST Linkages: Harvesting and Validating
                    Jeff Watson
                    Chris Thorn
                   Peter Witham
                 Timothy St. Louis

   Center for Data Quality and Systems Innovation

           Value-Added Research Center

                   UW Madison
2 ST Linkage Data Quality Studies
• On-Site validation of ST Linkage data in a large
  urban district

• Review of TIF districts’ processes for managing and verifying ST Linkage data
Validation Study
Goals:

  1. Determine how good / bad ST linkage data is
  2. Correct bad data
  3. Collect new data (e.g., team teaching)
Selecting a Sample
• Sample method goal is to be confident that estimates
  of error rates are relatively accurate
   – Sample first to assess DQ levels
      • Target a small number of schools so that DQ error rates can be
        estimated more accurately


• Census method goal is to confirm and adjust SIS ST
  linkages to improve the validity of classroom VA

   – Expanded sample as needed
      • Include all schools within program
   – Staged verification
Determine Sample Size

How to estimate sigma?

Estimate of σ (in percentage points)    n for ±2.5%    n for ±5.0%
10                                            64              16
15                                           144              36
20                                           256              64
25                                           400             100
30                                           576             144

Based on the approach for determining sample size for a targeted confidence interval from Wackerly, Mendenhall, and Scheaffer (1996).
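The values in the table are consistent with the usual large-sample formula n = (z·σ/E)² with z ≈ 2 (roughly a 95% confidence interval), where E is the desired half-width in percentage points. A minimal sketch under that interpretation; the function name is illustrative.

```python
import math

def sample_size(sigma, margin, z=2.0):
    """Classrooms needed so the error-rate estimate falls within +/- margin
    (percentage points) at roughly 95% confidence (z ~ 2)."""
    return math.ceil((z * sigma / margin) ** 2)

for sigma in (10, 15, 20, 25, 30):
    print(sigma, sample_size(sigma, 2.5), sample_size(sigma, 5.0))
# Reproduces the table above, e.g. sigma = 20 -> 256 and 64.
```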
Target = ?? to ensure 100 Classrooms
• N = 100
• Target more classrooms so that the resulting sample is at least 100
• 75% participation rate
• 50% content rate
• Starting sample = 100 / .75 / .5 ≈ 267 classrooms
• This resulted in recruiting about 9-10 urban schools of mixed sizes and grade levels.
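The same over-recruitment arithmetic as a small helper, using the rates quoted on the slide; the function name is illustrative.

```python
import math

def starting_sample(target, participation_rate, content_rate):
    """Classrooms to recruit so that at least `target` usable classrooms remain."""
    return math.ceil(target / participation_rate / content_rate)

print(starting_sample(100, 0.75, 0.50))  # 267, as on the slide
```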
Constraints
• Control for mobility elsewhere
  – Student
  – Teacher
• Focus on grades and courses that are aligned
  with the assessments and VA analysis
  – 3-8, Math/Reading
  – Formative assessments
  – End of course assessments
  – College readiness assessments
Process for Validation
• Collect data from district administration:
   – Initial data files (enrollment, entity data)
   – Course content case logic
• Collect data from principals:
   – School – teacher assignments
   – Teacher – course linkages
• Collect data from teachers:
   – Teacher – course assignments
   – Student-course assignments
   – Team teaching
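Conceptually, the verification step compares each SIS linkage against what principals and teachers report and buckets it into the categories shown on the next slide. Below is a simplified sketch of that bookkeeping, assuming verified rosters are available as sets of (student_id, teacher_id, course_id) tuples; the names and exact decision rules are illustrative, not the study’s actual procedure.

```python
from collections import Counter

def classify_linkages(sis_links, verified_links, team_taught_links):
    """Bucket each SIS (student, teacher, course) linkage by verification outcome.
    Illustrative only; categories mirror the results table below."""
    verified_teacher_courses = {(t, c) for _, t, c in verified_links}
    counts = Counter()
    for student, teacher, course in sis_links:
        if (student, teacher, course) in team_taught_links:
            counts["Team Taught Students"] += 1
        elif (student, teacher, course) in verified_links:
            counts["Accuracy Rate"] += 1
        elif (teacher, course) in verified_teacher_courses:
            counts["Incorrect Students"] += 1   # teacher taught the course, but not this student
        else:
            counts["Incorrect Courses"] += 1    # teacher did not actually teach this course
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}
```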
So… Who is Teaching Who?

Accuracy Rate            74.2%
Team Taught Students     15.7%
Incorrect Students        2.5%
Incorrect Courses         7.6%
TOTAL                   100.0%
Data from 9 schools in 1 urban district

[Bar chart: Accuracy Rate, Team Taught Students, Incorrect Students, and Incorrect Courses (0-100%) for each of schools A through I]
Counts of Student – Teacher Errors

Type of Classroom        No Add.    Pull-out   Push-in    Targeted
                         Services   Services   Services   Services   NULL    Grand Total
Traditional                  7952         57          0         26       0          8035
Team Taught                  1150        126         25        297       0          1598
“Was not in class”             10         92          2          0     163           267
“Did not teach class”           0          0          0          0     817           817
Grand Total                  9112        275         27        323     980         10717
Summary of Findings
• Validated 74% of the time
• Adjusted 16% of the time
• Wrong 10% of the time

• Schools’ ST Linkage data varied between 55% and 93%
  accurate

• Inaccurate records were the result of:
   – not knowing about team teaching (4-30%)
   – inaccurate teacher – course linkages (0-31%)
   – inaccurate student – course linkages (0-15%)
So How Do Performance Pay Projects Manage ST Linkage Data Quality?
1. How do TIF grantees obtain student-teacher
   linkage data?

2. How do TIF grantees validate student-teacher
   linkage data?

3. Are there any lessons to be learned?
Process Flow

[Diagram: SIS → Rosters (one per classroom) → Census Verify → Performance Estimation → $$Payout$$]

Process Flow

[Diagram: the same flow, SIS → Rosters → Census Verify → Performance Estimation → $$Payout$$, with a Data Warehouse added to the pipeline]
Managing Complexity

• Courses: Multiple subject areas, Project-based learning
• Teachers: Team teaching, Cross assignments
• Students: Mobility, Absenteeism
Enhancing Performance Pay Programs
• Verification: Critical and Strategic
  – Accuracy vetted
  – Enhanced stakeholder buy-in
  – Opportunity to capture nuance
     • Changes in time
     • Team teaching
Leverage Personnel for Different Purposes
• When/where to involve district staff
• When/where to involve principals
• When/where to involve teachers
Organization Design

• TIF as a Business Driver (not compliance
  focused; high stakes DSS)
• Multi-level connections
• Technical staff involvement
• Focus on managing teacher data
• Implementation → Challenge → Innovation
External partners can increase both
          capacity and overhead.
• SIS vendors, Database mgmt., Verification
  processes and tools
• Positive consequences:
  – Increased capacity, specialized and expert
    knowledge, external agency
• Negative consequences:
  – Vendor silos, more project mgmt. and
    communication overhead for the grantee
Summary
• Classroom Value-Added requires high-quality
  student – teacher linkage data
• SIS Student – teacher linkage data may not
  capture certain things well, like team teaching
• Verification can be staged
• Verification improves stakeholder trust and
  support as well as improving the validity of VA
  estimates and the decisions those estimates
  inform
Extending the Validity and Interpretation of Value-Added Data
Classroom Assignment and Matching

Sara Kraemer, Robert H. Meyer, and Robin Worth
Assignment and Value-Added
• To what extent are teachers systematically
  assigned to certain kinds of students?
  – Rothstein Critique
• What are the unique attributes of students
  and teachers?
  – Can we make them “observable” in the data?
• What is the process of assignment in schools?
Student Assignment & Classroom Value-Added
• Classroom assignment and student-teacher
  matching (validity and accuracy of VA)
  – Implications for student-teacher linkages
  – Sociotechnical analysis of school processes
• Sources to inform our analysis:
  – Principals and former principals from 3 large
    urban school districts
     • 2 districts with performance compensation systems.
Classroom Design and Attributes
• Purposive classroom assignment
   –   Heterogeneous (mix ability levels, demographics, etc…)
   –   Homogeneous (sort on ability levels)
   –   Vertical and horizontal alignment
   –   Ability grouping
   –   Split classrooms, looping, team teaching
   –   Other cases: Mobility, special education students
• Matching: Attributes
   – Teachers
        • Classroom management skills, personality, etc…
   – Student
        • Behavior, special cases, etc…
• Social and technical considerations
   – Value-added
   – Linking students to teachers
   – Organizational design and process of assignment and matching
Value added and assignment/matching
• Extent of unbalanced classrooms?
  – Ex: Teachers w/ strong classroom management
    skills
• Heterogeneous assignment perceived as more
  “fair”
• Implications for measuring “true” teacher
  effectiveness
• Some characteristics not measured in VA
  model
Social and organizational factors in
            assignment/matching - 1
• Process of assignment and matching in schools
   –   Principal led
   –   Teams of teachers (within and across grades)
   –   School leader/teacher/principal hybrid team
   –   Collaborative versus non-collaborative model
   –   Timing: End of academic year
• Input factors
   – Test scores, input from teachers/school leaders, principal
     observation, parent input
• Environmental issues
   – Teacher labor issues (turnover/retention, shortages,
     teacher allocation)
Social and organizational factors in
            assignment/matching - 2
• Contextual factors
   –   Levels of complexity
   –   Size of school, number of grades served
   –   Breadth and depth of courses offered
   –   School types
        • Elementary versus high school, charters
   – Demographics of student, teacher population
   – Degrees of community and parental involvement
   – Mobility of students across classrooms and/or schools
Assignment and Matching:
     Linking Students to Teachers in IS
• Verification of student-teacher linkages in data
• Information systems need to capture the reality of classrooms
• Assignment and matching models for
  explanatory power in linkage data
Next steps
• How can data systems support classroom
  assignment and value-added validity?
  – Assessment of school assignment practices within
    a district (survey development)
  – System requirements for data quality validation of
    student-teacher linkages
  – Data systems or other tools for school-level
    decision support
  – Reporting to schools that includes assignment assessment
Questions?

2011 data quality challenges when implementing classroom value added models

  • 1. Data Quality Challenges When Implementing Classroom Value- Added Models Chris Thorn Sara Kraemer Jeff Watson Center for Data Quality and Systems Innovation Value-Added Research Center UW Madison
  • 2. Outline • Overview VARC/CDQSI • Overview Value-Added methods • Student Teacher Linkage • Threats to validity for Classroom Value-Added – Student teacher linkage data quality – Assignment and matching of students and teachers
  • 3. VARC and CDQSI Overview • Value-Added Research Center Director – Rob Meyer • Center for Data Quality and Systems Innovation Director – Chris Thorn • Areas of work – Data systems / Data quality / Decision support – Professional development – Reporting systems – Evaluation – Mixed methods • Places where we work
  • 4. Districts and States working with VARC Minneapolis Milwaukee Madison Racine Chicago New York City US Dept. of Ed. Los Angeles Tulsa Atlanta Hillsborough County Broward County
  • 6. How do we measure student performance? • How do we do this? (example: the current NCLB method… percent proficient) (example: Colorado Growth Model) (example: VARC’s Value-Added Analysis) • The following non-education example tries to illustrate the difference between these measures.
  • 7. The Oak Tree Analogy
  • 8. Explaining the concept of Value-Added by evaluating the performance of two gardeners • For the past year, these gardeners have been tending to their oak trees trying to maximize the height of the trees. • Each gardener used a variety of strategies to help their own tree grow… which of these two gardeners was more successful with their strategies?
  • 9. To measure the performance of the gardeners, we will measure the height of the trees today (1 year after they began tending to the trees). • Using this method, Gardener B is the superior gardener. This method is analogous to using an Attainment Model.
  • 10. … but this attainment result does not tell the whole story. • These trees are 4 years old. • We need to find the starting height for each tree in order to more fairly evaluate each gardener’s performance during the past year. • The trees were much shorter last year. Oak A Oak B Age 4 3 Age 4 3 (1 (Today) year ago) (1 (Today) year ago)
  • 11. We can compare the height of the trees one year ago to the height today. • By finding the difference between these heights, we can determine how many inches the trees grew during the year of gardener’s care. • Oak B had more growth this year, so Gardener B is the superior gardener. This is analogous to a Simple Growth Model, also called Gain.
  • 12. … but this simple growth result does not tell the whole story either. • We do not yet know how much of this growth was due to the strategies used by the gardeners themselves. • This is an “apples to oranges” comparison. • For our oak tree example, three environmental factors we will examine are: Rainfall, Soil Richness, and Temperature.
  • 13. External condition Oak Tree A Oak Tree B Rainfall amount High Low Soil richness Low High Temperature High Low
  • 14. How much the gardeners’ own strategies contributed to the growth of the trees… • We can take out each environmental factor’s contribution to growth. • After these external factors are accounted for, we will be left with the effect of just the gardeners. • To find the correct adjustments, we will analyze data from all oaks in the region.
  • 15. In order to find the impact of rainfall, soil richness, and temperature, we will plot the growth of each individual oak in the region compared to its environmental conditions.
  • 16. Now that we have identified growth trends for each of these environmental factors, we need to convert them into a form usable for our calculations. Rainfall Low Medium High Growth in inches relative to the -5 -2 +3 average Soil Richness Low Medium High Growth in inches relative to the -3 -1 +2 average Temperature Low Medium High Growth in inches relative to the +5 -3 -8 average Now we can go back to Oak A and Oak B to adjust for their growing conditions.
  • 17. To calculate our new adjusted growth, we start with simple growth. • Next, we will use our numerical adjustments to account for the effect of each tree’s environmental conditions. • When we are done, we will have an “apples to apples” comparison of the gardeners’ influence on growth. +14 Simple +20 Simple Growth Growth
  • 18. Based on data for all oak trees in the region, we found that high rainfall resulted in 3 inches of extra growth on average. For having high rainfall, Oak A’s growth is adjusted by -3 to compensate. Similarly, for having low rainfall, Oak B’s growth is adjusted by +5 to compensate. +14 Simple +20 Simple - 3 for Rainfall + 5 for Rainfall
  • 19. For having poor soil, Oak A’s growth is adjusted by +3 to compensate. For having rich soil, Oak B’s growth is adjusted by -2 to compensate. +14 Simple +20 Simple - 3 for Rainfall + 5 for Rainfall + 3 for Soil - 2 for Soil
  • 20. For having high temperature, Oak A’s growth is adjusted by +8 to compensate. For having low temperature, Oak B’s growth is adjusted by -5 to compensate. +14 Simple +20 Simple - 3 for Rainfall + 5 for Rainfall + 3 for Soil - 2 for Soil + 8 for Temp - 5 for Temp
  • 21. Now that we have removed the effect of environmental conditions, our adjusted growth result puts the gardeners on a level playing field. We calculate that Gardener A’s effect on Oak A is +22 inches We calculate that Gardener B’s effect on Oak B is +18 inches +14 Simple +20 Simple - 3 for Rainfall + 5 for Rainfall + 3 for Soil - 2 for Soil + 8 for Temp - 5 for Temp _________ _________ +22 inches +18 inches Adjusted Growth Adjusted Growth
  • 22. Using this method, Gardener A is the superior gardener. By accounting for last year’s height and environmental conditions of the trees during this year, we found the “value” each gardener “added” to the growth of the tree. This is analogous to a Value-Added Model. +14 Simple +20 Simple - 3 for Rainfall + 5 for Rainfall + 3 for Soil - 2 for Soil + 8 for Temp - 5 for Temp _________ _________ +22 inches +18 inches Adjusted Growth Adjusted Growth
  • 23. How does this analogy relate to Value-Added calculations in the education context? Oak Tree Analogy Value-Added in Education Level of analysis? • Gardeners • IHE • Districts/Schools • Grades • Classrooms • Programs and Interventions What are we using to • Growth in Inches • Relative Growth in Scale Score measure success? Points Sample • Single Oak Tree • Groups of Students Control Factors • Rainfall • Students’ Prior Performance • Soil Richness (most significant predictor) • Temperature Potential other variables collected for ALL students • Free/Reduced Lunch Status • English Language Learner Status • IEP / Special Education Status • Race • Gender
  • 24. Where is Classroom-level Value-Added Used? • Recognizing Effective Practice • Providing Support • Performance Pay • Tenure Decisions • Higher Education Evaluation of Teacher Preparation • Performance Management
  • 25. Overview of Data Requirements for Classroom VAA • Student demographics • Teacher roster with background and accreditation • School list and information • Student and teacher attendance • Course enrollment with assigned teachers • Achievement data • Mobility data
  • 26. Simple is good…. ….unless its wrong.
  • 27. Defining Student Teacher Linkages: Who is Teaching Who? School Student Teacher Course Outcomes: Practice: Formative assessment, Pedagogy, Achievement, Curriculum, Graduation, Tutoring, Behavior, Coaching, PD, Post-graduation Scheduling, success Class size
  • 28. Warning: Reality Approaching School Student Teacher Course
  • 29. … movement between schools School Student School Teacher
  • 30. … and movement within schools School Student Teacher Course Course Course
  • 31. … and complicated workflows School enrollment Schedule may is estimated change based on completion School Student Staffing needs are estimated Determined sometime between Course Feb. and July Maybe hired in Sept. Teacher
  • 32. … and absence rates School Student Teacher Attendance rate, Extended leave, discipline moved to cover, infractions Course absences?
  • 33. … Instructional strategies like grouping, pull-outs, room aides, team teaching School Student Teacher Instructional Course Support staff Specialists
  • 34. … non-traditional schools School Non-district Student employed Teachers Course Unique instructional approaches
  • 35. … data errors, integration issues, SIS work arounds, multiple districts School SIS Student Student Teacher Teacher Matching Records Course HR Data entry Creative problems scheduling
  • 36. Ouch Course Scheduling Attendance rate, School SIS discipline infractions Student Teacher Extended leave, Student Student Teacher moved to cover, Matching Enrollment absences? Records Instructional Course HR support staff Data entry Creative HIRING problems scheduling Specialists workflow Unique School staffing instructional adjustments approaches Mobility School
  • 37. Challenges for Classroom Level Value- Added • Threats to Validity – Student teacher linkages – Assignment of students and teachers • Mobility • Assessment Quality • Assessment Coverage • Integration
  • 38. ST Linkages: Harvesting and Validating Jeff Watson Chris Thorn Peter Witham Timothy St. Louis Center for Data Quality and Systems Innovation Value-Added Research Center UW Madison
  • 39. 2 ST Linkage Data Quality Studies • On-Site validation of ST Linkage data in a large urban district • Review of TIF districts processes for managing and verifying ST Linkage data
  • 40. Validation Study Goals: 1. Determine how good / bad ST linkage data is 2. Correct bad data 3. Collect new data (e.g., team teaching)
  • 41. Selecting a Sample • Sample method goal is to be confident that estimates of error rates are relatively accurate – Sample first to assess DQ levels • Target a small number of schools so that DQ error rates can be estimated more accurately • Census method goal is to confirm and adjust SIS ST linkages to improve the validity of classroom VA – Expanded sample as needed • Include all schools within program – Staged verification
  • 42. Determine Sample Size How to estimate sigma? Estimate of σ (in percent points) ±2.5% ±5.0% 10 64 16 15 144 36 20 256 64 25 400 100 30 576 144 Based on approach for determining samples size for a targeted confidence interval from Wackerly, Mendenhall, Scheaffer (1996)
  • 43. Target = ?? to ensure 100 Classrooms • N=100 • Target more classrooms so that the resulting sample in at least 100 • 75% participation rate • 50% content rate • Starting Sample = 100 / .75 /.5 = 267 classrooms resulted in us recruiting about 9-10 urban schools, mixed size and grade levels.
  • 44. Constraints • Control for mobility elsewhere – Student – Teacher • Focus on grades and courses that are aligned with the assessments and VA analysis – 3-8, Math/Reading – Formative assessments – End of course assessments – College readiness assessments
  • 45. Process for Validation • Collect data from district administration: – Initial data files (enrollment, entity data) – Course content case logic • Collect data from principals: – School – teacher assignments – Teacher – course linkages • Collect data from teachers: – Teacher – course assignments – Student-course assignments – Team teaching
  • 46. So…. Who is Teaching Who? Accuracy Rate 74.2% Team Taught Students 15.7% Incorrect Students 2.5% Incorrect Courses 7.6% TOTAL 100%
  • 47. Data from 9 schools in 1 urban district 100.0% 90.0% 80.0% 70.0% 60.0% 50.0% Accuracy Rate 40.0% Team Taught Students 30.0% Incorrect Students 20.0% Incorrect Courses 10.0% 0.0% A B C D E F G H I
  • 48. Counts of Student – Teacher Errors No Add. Pull-out Push-in Targeted Services Services Services Services NULL Grand Total Traditional 7952 57 0 26 0 8035 Type of Classroom Team Taught 1150 126 25 297 0 1598 “Was not in class” 10 92 2 0 163 267 “Did not teach class” 0 0 0 0 817 817 Grand Total 9112 275 27 323 980 10717
  • 49. Summary of Findings • Validated 74% of the time • Adjusted 16% of the time • Wrong 10% of the time • Schools’ ST Linkage data varied between 55% and 93% accurate • Inaccurate records were the result of: – not knowing about team teaching (4-30%) – inaccurate teacher – course linkages (0-31%) – inaccurate student – course linkages (0-15%)
  • 50. So How Do Performance Pay Projects Manage ST Linkage Data Quality? 1. How do TIF grantees obtain student-teacher linkage data? 2. How do TIF grantees validate student-teacher linkage data? 3. Are there any lessons to be learned?
  • 51. Process Flow Roster Roster Roster Census Performance SIS Rosters Verify Estimation $$Payout$$
  • 52. Process Flow Roster Roster Roster Census Performance SIS Rosters Verify Estimation Data Warehouse $$Payout$$
  • 53. Managing Complexity  Courses: Multiple subject areas, Project-based learning  Teachers: Team teaching, Cross assignments  Students: Mobility, Absenteeism
  • 54. Enhancing Performance Pay Programs • Verification: Critical and Strategic – Accuracy vetted – Enhanced stakeholder buy-in – Opportunity to capture nuance • Changes in time • Team teaching
  • 55. Leverage Personnel for Different Purposes. – When / where to involve district staff – When/where to involve principals – When/where to involve teachers
  • 56. Organization Design • TIF as a Business Driver (not compliance focused; high stakes DSS) • Multi-level connections • Technical staff involvement • Focus on managing teacher data • Implementation → Challenge → Innovation
  • 57. External partners can increase both capacity and overhead. • SIS vendors, Database mgmt., Verification processes and tools • Positive consequences: – Increased capacity, specialized and expert knowledge, external agency • Negative consequences: – Vendor silos, more project mgmt. and communication overhead for the grantee
  • 58. Summary • Classroom Value-Added requires high-quality student – teacher linkage data • SIS Student – teacher linkage data may not capture certain things well, like team teaching • Verification can be staged • Verification improves stakeholder trust and support as well as improving the validity of VA estimates and the decisions those estimates inform
  • 59. Extending the Validity and Interpretation of Value-Added Data Classroom Assignment and Matching Sara Kraemer, Robert H. Meyer, and Robin Worth
  • 60. Assignment and Value-Added • To what extent are teachers systematically assigned to certain kinds of students? – Rothstein Critique • What are the unique attributes of students and teachers? – Can we make them “observable” in the data? • What is the process of assignment in schools?
  • 61. Student Assignment & Classroom Value- Added • Classroom assignment and student-teacher matching (validity and accuracy of VA) – Implications for student-teacher linkages – Sociotechnical analysis of school processes • Sources to inform our analysis: – Principals and former principals from 3 large urban school districts • 2 districts with performance compensation systems.
  • 62. Classrooms Design and Attributes • Purposive classroom assignment – Heterogeneous (mix ability levels, demographics, etc…) – Homogeneous (sort on ability levels) – Vertical and horizontal alignment – Ability grouping – Split classrooms, looping, team teaching – Other cases: Mobility, special education students • Matching: Attributes – Teachers • Classroom management skills, personality, etc… – Student • Behavior, special cases, etc… • Social and technical considerations – Value-added – Linking students to teachers – Organizational design and process of assignment and matching
  • 63. Value added and assignment/matching • Extent of unbalanced classrooms? – Ex: Teachers w/ strong classroom management skills • Heterogeneous assignment perceived as more “fair” • Implications for measuring “true” teacher effectiveness • Some characteristics not measured in VA model
  • 64. Social and organizational factors in assignment/matching - 1 • Process of assignment and matching in schools – Principal led – Teams of teachers (within and across grades) – School leader/teacher/principal hybrid team – Collaborative versus non-collaborative model – Timing: End of academic year • Input factors – Test scores, input from teachers/school leaders, principal observation, parent input • Environmental issues – Teacher labor issues (turnover/retention, shortages, teacher allocation)
  • 65. Social and organizational factors in assignment/matching - 2 • Contextual factors – Levels of complexity – Size of school, number of grades served – Breadth and depths of courses offered – School types • Elementary versus high school, charters – Demographics of student, teacher population – Degrees of community and parental involvement – Mobility of students across classrooms or/and schools
• 66. Assignment and Matching: Linking Students to Teachers in IS • Verification of student-teacher linkages in data • Information systems need to capture the reality of classrooms • Assignment and matching models can add explanatory power to linkage data
• 67. Next steps • How can data systems support classroom assignment and value-added validity? – Assessment of school assignment practices within a district (survey development) – System requirements for data quality validation of student-teacher linkages – Data systems or other tools for school-level decision support – Reporting to schools, including assignment assessment

Editor's Notes

1. Three different ways of measuring student performance include attainment, gain or growth, and Value-Added. The following non-education example, the Oak Tree Analogy, tries to illustrate the difference between these measures. All can be based on the same data, but the information each tells us can be very different.
2. This Oak Tree Analogy was created to introduce the concept of Value-Added calculations. It is deliberately set outside the education context to keep this overview of the theory of Value-Added separate from details specific to its use in education.
3. In this analogy, we will be explaining the concept of Value-Added by evaluating the performance of two gardeners. For the past year, these gardeners have been tending to their oak trees, trying to maximize the height of the trees. Each gardener used a variety of strategies to help their own tree grow. We want to evaluate which of these two gardeners was more successful with their strategies.
4. To measure the performance of the gardeners, we will measure the height of the trees today, 1 year after they began tending to the trees. With a height of 61 inches for Oak Tree A and 72 inches for Oak Tree B, we find Gardener B to be the superior gardener. This method is analogous to using an Attainment Model to evaluate performance.
5. …but this attainment result does not tell the whole story. These gardeners did not start with acorns. The trees are 4 years old at this point in time. We need to find the starting height for each tree in order to more fairly evaluate each gardener’s performance during the past year. Looking back at our yearly record, we can see that the trees were much shorter last year.
6. We can compare the height of the trees one year ago to the height today. By finding the difference between these heights, we can determine how many inches the trees grew during the year of the gardeners’ care. Using this method, Gardener A’s tree grew 14 inches while Gardener B’s tree grew 20 inches. Oak B had more growth this year, so Gardener B is the superior gardener. This is analogous to using a Simple Growth Model, also called Gain.
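To make the Gain arithmetic concrete, here is a minimal sketch in Python. The prior-year heights (47 and 52 inches) are not restated in these notes; they are back-calculated from the current heights (61 and 72 inches, note 4) and the reported growth (14 and 20 inches), and are included only to make the example self-contained.

```python
# Simple Growth (Gain): this year's height minus last year's height.
height_last_year = {"Oak A": 47, "Oak B": 52}   # back-calculated: 61 - 14 and 72 - 20
height_this_year = {"Oak A": 61, "Oak B": 72}   # heights reported in note 4

gain = {tree: height_this_year[tree] - height_last_year[tree] for tree in height_this_year}
print(gain)  # {'Oak A': 14, 'Oak B': 20} -> under a Gain model, Gardener B looks superior
```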
7. But this Simple Growth result does not tell the whole story either. Although we know how many inches the trees grew during this year, we do not yet know how much of this growth was due to the strategies used by the gardeners themselves. This is an “apples to oranges” comparison. If we really want to fairly evaluate the gardeners, we need to take into account other factors that influenced the growth of the trees. For our oak tree example, three environmental factors we will examine are: Rainfall, Soil Richness, and Temperature.
8. Based on the data for our trees, we can see what kind of external conditions our two trees experienced during the last year. Oak Tree A was in a region with high rainfall while Oak Tree B experienced low rainfall. Oak Tree A had low soil richness while Oak Tree B had high soil richness. Oak Tree A had high temperature while Oak Tree B had low temperature.
9. Our final goal is to determine how much the gardeners’ own strategies contributed to the growth of the trees. If we can determine the impact of rainfall, soil richness, and temperature on the growth of the trees, we can then take out each environmental factor’s contribution to growth. After these external factors are accounted for, we will be left with the effect of just the gardeners. To find the correct adjustments, we will analyze data from all oaks in the region.
10. In order to find the impact of rainfall, soil richness, and temperature, we will plot the growth of each individual oak in the region compared to its environmental conditions. On the x-axis, we plot the relative amount of each environmental condition. On the y-axis, we plot how much each tree grew from year 3 to year 4. Each dot represents a single oak tree in the area. By calculating an average line through the data, we can determine a trend for each environmental factor. From the data we collected for our region, we find that more rainfall and higher soil richness contributed positively to growth. Higher temperatures contributed negatively to growth.
11. Now that we have identified growth trends for each of these environmental factors, we need to convert them into a form usable for our calculations. We can summarize our trend information by determining a numerical adjustment based on high, medium, and low amounts of each environmental condition. For example, based on our data, we found that oak trees that experienced low rainfall tended to have 5 fewer inches of growth compared to the average growth of oak trees in the region. Trees with medium rainfall tended to have 2 fewer inches of growth compared to the average. Trees with high rainfall tended to have 3 more inches of growth compared to the average. We calculate these numerical adjustments for all environmental conditions to summarize the trends from the data. Now we can go back to Oak A and Oak B to adjust for their growing conditions.
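One simple way to read “compared to the average growth of oak trees in the region” is: group the regional oaks by their level of a condition and compare each group’s average growth to the regional average. The sketch below does this for rainfall with a tiny, invented sample constructed so the results match the -5 / -2 / +3 figures in the note; it illustrates the idea only and is not how an operational value-added model estimates these adjustments.

```python
# Hypothetical regional sample: growth in inches (year 3 -> year 4), grouped by rainfall level.
# The numbers are invented so the resulting effects match the -5 / -2 / +3 example above.
regional_growth = {
    "low":    [10],
    "medium": [12, 14],
    "high":   [17, 18, 19],
}

all_growth = [g for growths in regional_growth.values() for g in growths]
regional_avg = sum(all_growth) / len(all_growth)  # 15.0 inches

# Effect of each rainfall level = that level's average growth minus the regional average.
rainfall_effect = {
    level: sum(growths) / len(growths) - regional_avg
    for level, growths in regional_growth.items()
}
print(rainfall_effect)  # {'low': -5.0, 'medium': -2.0, 'high': 3.0}
# To adjust an individual tree (note 13), subtract its level's effect from its growth.
```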
12. To calculate our new adjusted growth, we start with the simple growth we calculated earlier. Next, we will use our numerical adjustments to account for the effect of each tree’s environmental conditions. Essentially, we will be removing the effect of these factors. After we have removed the effect of these external conditions, we will be left with the portion of growth that was due to the gardeners’ own strategies. When we are done, we will have an “apples to apples” comparison of the gardeners’ influence on growth.
13. Based on data for all oak trees in the region, we found that high rainfall resulted in 3 inches of extra growth on average. For having high rainfall, Oak A’s growth is adjusted by -3 to compensate. In other words, it was easier for Oak Tree A to make growth due to rainfall conditions that were not caused by the gardener. Since we are interested in what Gardener A alone contributed to the growth of the tree, we take away 3 inches of growth to remove the contribution high rainfall made to Oak Tree A’s growth. Similarly, for having low rainfall, Oak B’s growth is adjusted by +5 to compensate. It was harder for Oak Tree B to make growth due to rainfall conditions not caused by the gardener. Since we are interested in what Gardener B alone contributed to the growth of the tree, we add 5 inches of growth to remove the effect of low rainfall on Oak Tree B’s growth.
14. We continue this process for our other environmental factors. For having poor soil, Oak A’s growth is adjusted by +3. For having rich soil, Oak B’s growth is adjusted by -2.
15. For having high temperature, Oak A’s growth is adjusted by +8. For having low temperature, Oak B’s growth is adjusted by -5.
16. Now that we have removed the effect of environmental conditions, our adjusted growth result puts the gardeners on a level playing field. What we are left with is the effect of the gardeners themselves. We calculate that Gardener A’s effect on Oak A is +22 inches. We calculate that Gardener B’s effect on Oak B is +18 inches.
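Pulling notes 12 through 16 together, the adjusted-growth arithmetic can be checked in a few lines; every number below is taken directly from these notes.

```python
# Start from each tree's simple growth (note 6), then apply the per-condition adjustments
# from notes 13-15 to strip out the effect of the growing conditions.
simple_growth = {"Oak A": 14, "Oak B": 20}

adjustments = {
    "Oak A": {"rainfall": -3, "soil": +3, "temperature": +8},
    "Oak B": {"rainfall": +5, "soil": -2, "temperature": -5},
}

# What remains after the adjustments is attributed to the gardener.
gardener_effect = {
    tree: simple_growth[tree] + sum(adjustments[tree].values())
    for tree in simple_growth
}
print(gardener_effect)  # {'Oak A': 22, 'Oak B': 18} -> Gardener A added more value
```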
17. Using this method, Gardener A is the superior gardener. By accounting for last year’s height and the environmental conditions of the trees during this year, we have found the “value” each gardener “added” to the growth of the tree. This is analogous to a Value-Added Model.
18. This analogy was purposefully kept out of the education context. How does this analogy relate to Value-Added calculations in the education context? What are we evaluating? In the oak tree analogy, we evaluated gardeners. In the education context, we are evaluating districts, schools, grades, classrooms, programs, and interventions. What are we using to measure success? In the oak tree analogy, we measure growth in inches. In the education context, we measure relative growth in scale score points. What about our sample? In the oak tree analogy, we only used a single oak tree per gardener. In the education context, we use groups of students. What do we control for? In the oak tree analogy, we analyzed rainfall, soil richness, and temperature. We were then able to remove their contribution to growth. We call this “controlling” for these factors. In the education context, we control for prior performance. This tends to be the most significant predictor of student performance. Based on what other data is available for ALL students, we also control for other factors beyond the district, school, or classroom’s influence, such as: Free/Reduced Lunch status, English Language Learner status, IEP/Special Education status, race, and gender.
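For readers who want to see the education-context analogue written down, a generic value-added regression of this kind is sketched below. It is only an illustration of “controlling for” prior performance and student characteristics, not VARC’s operational specification.

```latex
% Generic value-added regression (illustrative only, not VARC's operational model):
%   y_{i,t}       current-year scale score of student i
%   y_{i,t-1}     prior-year score (the strongest predictor, per note 18)
%   X_i           student controls, e.g. FRL, ELL, IEP status, race, gender
%   \theta_{c(i)} value-added effect of student i's classroom c(i)
%   \varepsilon   unexplained residual
y_{i,t} = \beta_0 + \beta_1\, y_{i,t-1} + \gamma' X_i + \theta_{c(i)} + \varepsilon_{i,t}
```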