EVAAS for Educators



  Beginning – Intermediate – Advanced
                  Date
Today’s Presenters
  Dr. Jody Cleven
  Professional Development
  Consultant
  Region 5
  Jody.cleven@dpi.nc.gov




  Paul Marshall
  Professional Development
  Consultant
  Region 5
  Paul.marshall@dpi.nc.gov
Our Agenda
  •   Welcome, Introductions, Agenda Overview
  •   EVAAS and Data
  •   System Overview
  •   Growth vs. Proficiency
  •   Pre-Assessment
  •   Reflective Assessments
  •   Exit Ticket




Outcomes:

• Be familiar with reflective assessments
• Understand the various EVAAS reports
Can We Agree?
•   To be actively involved
•   Value differences
•   Agree to disagree
•   Listen
Resources
Additional Resources:
• EVAAS Wiki Page
Virtual Professional Development




            https://ncdpi.sas.com
Data Literacy Module
              https://center.ncsu.edu/nc


                Data Resource Guide
http://www.ncpublicschools.org/acre/improvement/resources/
Pre-Assessment
What do you know about
 EVAAS?

• http://www.socrative.com/
• Click: Student Log in,
  Room: 92564
It’s Connected
What is EVAAS?




                 So What Does It Do?
What is Data?


                Data can be defined as
          information organized for analysis
              or used to make decisions.
What is Data Literacy?

      Understanding needed to:
               • Find
               • Evaluate
               • Utilize

        to inform instruction.
A Data Literate Person Can…

A data literate person possesses the knowledge
   to gather, analyze, and graphically convey
   information to support short and long-term
                 decision-making.
NC Professional Teaching Standards
Standard I: Teachers demonstrate leadership.
 Take responsibility for the progress of all students
 Use data to organize, plan, and set goals
 Use a variety of assessment data throughout the year to evaluate progress
 Analyze data


Standard IV: Teachers facilitate learning for their students.
 Use data for short and long range planning


Standard V: Teachers are reflective on their practice.
 Collect and analyze student performance data to improve effectiveness
Standard 6 for Teachers

  Teachers contribute to the academic
         success of students.
  The work of the teacher results in acceptable,
  measurable progress for students based on
  established performance expectations using
  appropriate data to demonstrate growth.
Benefits and Considerations for Teachers

Benefits:
•   Understand academic preparedness of students before they enter the classroom.
•   Monitor student progress, ensuring growth opportunities for all students.
•   Modify curriculum, student support, and instructional strategies to address the needs of all students.

Considerations (Professional Development is the Key):
•   Culture of School
•   Sensitivity of Data
•   Finger Pointing and Blame Game
•   Window vs. Mirror
NC Standards for School Executives
Standard 2: Instructional Leadership
•   Focuses his or her own and others’ attention persistently and
    publicly on learning and teaching by initiating and guiding
    conversations about instruction and student learning that are
    oriented towards high expectations and concrete goals;
•   Creates processes for collecting and using student test data
    and other formative data from other sources for the
    improvement of instruction
•   Ensures that there is an appropriate and logical alignment
    between the curriculum of the school and the state’s
    accountability program
Standard 8 for School Executives

     Academic Achievement Leadership
School executives will contribute to the
academic success of students. The work of
the school executive will result in acceptable,
measurable progress for students based on
established performance expectations using
appropriate data to demonstrate growth.
Benefits for Principals
• Gain a consolidated view of student progress and
  teacher effectiveness, as well as the impact of
  instruction and performance.
• Bring clarity to strategic planning and function as a
  catalyst for conversations that must take place to
  ensure that all students reach their potential.
• Understand and leverage the strengths of effective
  teachers.
• Use the valuable resource of effective teaching to
  benefit as many students as possible.
ACHIEVEMENT VS. GROWTH
Student Achievement

[Chart: student achievement relative to the Proficient level at the end of the school year.]
Student Progress

[Chart: change over time, a student rising from Not Proficient at the start of the school year to Proficient at the end.]
Student Progress

[Chart: change over time, a student rising from Proficient at the start of the school year to Advanced at the end.]
Student Progress

[Chart: change over time for both students, Not Proficient to Proficient and Proficient to Advanced, from the start to the end of the school year.]
Achievement vs. Growth

Student Achievement: Where are we?
• Highly correlated with demographic factors

Student Growth: How far have we come?
• Highly dependent on what happens as a result of schooling rather than on demographic factors
The EVAAS Philosophy

• All students deserve opportunities to make appropriate
  academic progress every year.
• There is no “one size fits all” way of educating students
  who enter a class at different levels of academic
  achievement.
The EVAAS Philosophy


• Adjustments to instruction should be based on the
  students’ academic needs, not on socio-economic
  factors.
• "What teachers know and can do is the most important
  influence on what students learn." (National
  Commission on Teaching and America's Future, 1996)
Achievement and Poverty




                How is this fair?
Academic Growth and Poverty




      No one is doomed to failure.
Proficiency vs Growth

Scenario                                                     Proficient   Growth
5th grader begins the year reading at a 1st grade level
and ends the year reading at a 4th grade level.                  NO        YES
5th grader begins the year reading at a 7th grade level
and ends the year reading at the 7th grade level.                YES        NO
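The distinction in the scenarios above can be sketched in a few lines of code. This is an illustrative toy only, not how EVAAS computes growth; the `classify` helper and the grade-level numbers are invented for this example.

```python
# Toy illustration of proficiency vs. growth; NOT the EVAAS growth model.
# "Levels" here are hypothetical grade-level reading scores.

def classify(start_level, end_level, grade):
    """Return whether the student ends the year proficient, and whether they grew."""
    return {
        "proficient": end_level >= grade,   # reading at or above grade level
        "growth": end_level > start_level,  # ended the year higher than they started
    }

# 5th grader: 1st-grade level -> 4th-grade level (not proficient, but grew)
print(classify(1, 4, grade=5))
# 5th grader: 7th-grade level -> 7th-grade level (proficient, but no growth)
print(classify(7, 7, grade=5))
```

The point of the sketch: the two judgments are independent, so a student can earn either answer on each axis.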
EVAAS Overview
What is EVAAS?
How can EVAAS help me?
Education Value-Added Assessment System

– Answers the question of how effective a schooling
  experience is
– Produces reports that
   • Predict student success
   • Show the effects of schooling at particular schools
   • Reveal patterns in subgroup performance
Changes in Reporting for 2012-13

2011-12                     2012-13
Above                       Exceeds Expected Growth
Not Detectably Different    Meets Expected Growth
Below                       Does Not Meet Expected Growth
District Value Added Report

• Use to evaluate the overall effectiveness of a district on student progress

• Compares each district to the average district in the state for each subject tested in the given year

• Indicates how a district influences student progress in the tested subjects
Value-Added Reporting

                   The NCE Base is by definition set at 50.0,
                   and it represents the average attainment
                   level of students in the grade and subject,
                   statewide.



If the school mean is greater, the average student in the
school is performing at a higher achievement level than the
average student in the state.
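The comparison described above is a simple threshold check against the NCE base. A minimal sketch (the function name is invented for illustration):

```python
# Minimal sketch of reading a school mean NCE against the statewide base.
NCE_BASE = 50.0  # by definition, the average attainment level statewide

def interpret_school_mean(school_mean_nce):
    """Describe a school mean NCE relative to the state base of 50.0."""
    if school_mean_nce > NCE_BASE:
        return "above the state average attainment level"
    if school_mean_nce < NCE_BASE:
        return "below the state average attainment level"
    return "at the state average attainment level"

print(interpret_school_mean(53.2))  # a school mean of 53.2 is above the base
```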
District Diagnostic Reports


 • Use to identify patterns or trends of progress
   among students expected to score at different
   achievement levels
Diagnostic Report
District Performance Diagnostic Reports


•   Use to identify patterns or trends of progress among students
    predicted to score at different performance levels as determined by
    their scores on NC tests
•   Students assigned to Projected Performance Levels based on their
    predicted scores
•   Shows the number (Nr) and percentage of students in the district
    that fall into each Projected Performance Level
District Performance Diagnostic Reports
Interpreting the Pie Chart

[Pie chart with segments labeled Green, Yellow, and Light Red.]
Reflective Assessments
Value-Added Reports
Diagnostic Reports – the whiskers
Diagnostic Reports
Looking for Patterns
School Diagnostic
Shed Pattern
School Diagnostic
Reverse Shed Pattern
School Diagnostic
Tent Pattern
School Diagnostic
V Pattern
School Diagnostic
Opportunity Gap Pattern
What would an ideal pattern on a Diagnostic Report look like for closing the achievement gap?
Diagnostic Reports – Desirable Pattern
Diagnostic Report
Desirable Pattern
DIAGNOSTIC & PERFORMANCE DIAGNOSTIC REPORTS (PART 2)
Diagnostic Reports – the whiskers
Overview of School Effects (sample data)
1. Go to ncdpi.sas.com

       2.   BOOKMARK IT!

                 3. Secure & Convenient
                      Online Login
Do you see this?




                   Then Sit Tight!
Overview of School Effects
It’s Your Turn!
• Find the blank table.
• Using the sample data, fill in your table.
• Do this by yourself.
Overview of School Effects
 What did you find?
 • Interesting Patterns
 • Insights
 • Areas of Concern
 • Areas of Celebration
1. Go to the website
          ncdpi.sas.com
Finding Your Patterns
Interpreting Your Results

[Embedded Microsoft Word document]
Student Pattern Report
Student Pattern Report

Key points to remember:

•The report shows growth for the lowest, middle, and highest
achieving students within the chosen group.
•The report can be used to explore the progress of students with
similar educational opportunities.
•Like all diagnostic reports, this report is for diagnostic purposes only.
•A minimum of 15 students is needed to create a Student Pattern
Report.
Student Pattern Report
Student Pattern Report
Key Questions
Student Pattern Report – Key Questions

                            Different experience?
                            Different strategies?
                            Different needs?
                            Number of hours?
Student Pattern Report – Key Questions



                           Different experience?
                           Different strategies?
                           Different needs?
                           Number of hours?
                               YES!
                              Rerun the report
                              with new criteria.
Student Pattern Report – Next Steps

All 31 Students in the Program        16 Students who attended for 40+ hours
Less Informed Conclusion: We
need to change the selection
criteria for this program.

 More Informed Conclusion: We need to adjust the recommended
 hours for participants.
Proactive Assessments
Academic At-Risk Reports
                 • Reports
                   – Academic At-Risk
                     Report
Academic At-Risk Reports

3 Categories

AYP at Risk – at risk for not meeting the academic indicators for AYP
Graduation at Risk – reports for students at risk for not making a Level III on EOC subjects required for graduation
Other at Risk – reports for students at risk for not making Level III on other EOC subjects
Academic at Risk Reports
Be Proactive


Use these reports to determine local policy for
providing targeted intervention and support to
students who are at risk for not meeting future
academic milestones.




Making Data Driven Decisions
What Are Projections?
What Are Projections Anyway?

                Given a specific set
                of circumstances…

                …what’s the most
                likely outcome?
What Are Projections Anyway?
                Given this student’s testing
                history, across subjects…

                …what is the student likely
                to score on an upcoming
                test, assuming the student
                has the average schooling
                experience?
EVAAS Projections
What are they based on?

• Expectations based on what we know
    » About this student and other students who have
      already taken this test
    » Prior test scores (EOC/EOG), across subjects
    » Their scores on the test we’re projecting to
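As rough intuition for "expectations based on prior scores," here is a deliberately simplified sketch. EVAAS's actual projection model is far more sophisticated, pooling full multivariate testing histories; the `project` helper, the made-up score data, and the shift-from-the-mean method below are invented for illustration only.

```python
# Illustrative sketch only -- NOT the EVAAS methodology. All scores are made up.
# Idea: past students' prior scores and eventual target-test scores set an
# expectation; a new student's projection is shifted by how their priors
# compare to those past students' priors.

import statistics

# ((prior EOG math, prior EOG reading), eventual target-test score) per past student
history = [((350, 340), 355), ((360, 358), 366), ((340, 345), 348), ((370, 362), 372)]

def project(priors, history):
    """Project a target score: historical mean target, shifted by how far this
    student's average prior sits from the historical average prior."""
    mean_target = statistics.mean(t for _, t in history)
    mean_prior = statistics.mean(statistics.mean(p) for p, _ in history)
    return mean_target + (statistics.mean(priors) - mean_prior)

print(round(project((355, 350), history), 1))
```

The design point the sketch preserves: a projection is an expectation conditioned on observed prior scores, not a guarantee about any individual student.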
What’s the Value of the Projections?

Projections are NOT about predicting the future.

They ARE about assessing students’ academic needs TODAY.
Assessing Students’ Needs
• What are this student’s chances for success?
• What goals should we have for this student this
  year?
• What goals should we have for this student in
  future years?



What can I do to help this student get there?
Using Projections to Take Action

                  • Identify students
                  • Assess the level of risk
                  • Plan schedules
                  • Identify high-achievers
                  • Assess the opportunities
                  • Inform
Making Data Driven Decisions
Data Mining

[Embedded Microsoft Word document]
REFLECTION + PROJECTION = TODAY
Student Project Report
Thinking of the State Distribution by QUINTILES
       QUINTILE 5

       QUINTILE 4

       QUINTILE 3

       QUINTILE 2

       QUINTILE 1
Note the Student’s Projected QUINTILE

QUINTILE 2
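Quintiles split the state distribution into five equal bands of 20 percentile points each, with Quintile 1 the lowest fifth and Quintile 5 the highest. A minimal sketch of mapping a projected percentile to its quintile (the function is invented for illustration):

```python
# Map a projected state percentile (0-100) to a quintile, 1 (lowest) to 5 (highest).

def quintile(percentile):
    """Each quintile spans 20 percentile points; cap at 5 for the 100th percentile."""
    return min(int(percentile // 20) + 1, 5)

print(quintile(35.0))  # a projection at the 35th percentile falls in Quintile 2
```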
Reflecting on Past Effectiveness to Plan for Differentiating Student Instruction

[Chart: Past Effectiveness plotted against Entering Achievement, with the student’s projected QUINTILE 2 highlighted.]
ACADEMIC PREPAREDNESS REPORT
Academic Preparedness Report




CUSTOM STUDENT REPORT
Custom Student Report HANDOUT
Thank You!!!!
Dr. Jody Cleven
Professional Development
Consultant
Region 5
Jody.cleven@dpi.nc.gov




Paul Marshall
Professional Development
Consultant
Region 5
Paul.marshall@dpi.nc.gov

More Related Content

Similar to EVAAS

Evaas training 10 12-12 pdf
Evaas training 10 12-12 pdfEvaas training 10 12-12 pdf
Evaas training 10 12-12 pdfadriane3054
 
Sse workshop 2 spring 2014
Sse workshop 2 spring 2014Sse workshop 2 spring 2014
Sse workshop 2 spring 2014Martin Brown
 
Naviance Summer Institute 2015 Product Forum
Naviance Summer Institute 2015 Product ForumNaviance Summer Institute 2015 Product Forum
Naviance Summer Institute 2015 Product ForumNaviance
 
NCDPI NCTED Meeting April 2014
NCDPI NCTED Meeting April 2014NCDPI NCTED Meeting April 2014
NCDPI NCTED Meeting April 2014rachelmcbroom
 
Sse pp 11 november 2013
Sse pp   11 november 2013Sse pp   11 november 2013
Sse pp 11 november 2013Martin Brown
 
Sse pp 11 november 2013 (1)
Sse pp   11 november 2013 (1)Sse pp   11 november 2013 (1)
Sse pp 11 november 2013 (1)Martin Brown
 
Briargrove Elementary Assessment Policy 2013-2014
Briargrove Elementary Assessment Policy 2013-2014Briargrove Elementary Assessment Policy 2013-2014
Briargrove Elementary Assessment Policy 2013-2014eellswor
 
Exceptional student services rigorous curriculum design
Exceptional student services rigorous curriculum designExceptional student services rigorous curriculum design
Exceptional student services rigorous curriculum designToni Theisen
 
Assessment update february 2014
Assessment update february 2014Assessment update february 2014
Assessment update february 2014sycamorees
 
Baraabaru school indicators
Baraabaru school indicatorsBaraabaru school indicators
Baraabaru school indicatorsMohamed_Nazim
 
Sse workshop 2 spring 2014
Sse workshop 2 spring 2014Sse workshop 2 spring 2014
Sse workshop 2 spring 2014Martin Brown
 
2012 IHE Institutes Leaders with Leaders Session
2012 IHE Institutes Leaders with Leaders Session2012 IHE Institutes Leaders with Leaders Session
2012 IHE Institutes Leaders with Leaders Sessionrachelmcbroom
 
One Year with Naviance Curriculum
One Year with Naviance Curriculum One Year with Naviance Curriculum
One Year with Naviance Curriculum Naviance
 
NSI 2014: Product Forum
NSI 2014: Product ForumNSI 2014: Product Forum
NSI 2014: Product ForumNaviance
 
How Tracking Student Data Improves Teaching Practices by Bernard Pierorazio
How Tracking Student Data Improves Teaching Practices by Bernard PierorazioHow Tracking Student Data Improves Teaching Practices by Bernard Pierorazio
How Tracking Student Data Improves Teaching Practices by Bernard PierorazioBernard Pierorazio
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growthJohn Cronin
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growthJohn Cronin
 

Similar to EVAAS (20)

Evaas training 10 12-12 pdf
Evaas training 10 12-12 pdfEvaas training 10 12-12 pdf
Evaas training 10 12-12 pdf
 
Sse workshop 2 spring 2014
Sse workshop 2 spring 2014Sse workshop 2 spring 2014
Sse workshop 2 spring 2014
 
Naviance Summer Institute 2015 Product Forum
Naviance Summer Institute 2015 Product ForumNaviance Summer Institute 2015 Product Forum
Naviance Summer Institute 2015 Product Forum
 
NCDPI NCTED Meeting April 2014
NCDPI NCTED Meeting April 2014NCDPI NCTED Meeting April 2014
NCDPI NCTED Meeting April 2014
 
NASPA AnP 2014
NASPA AnP 2014NASPA AnP 2014
NASPA AnP 2014
 
Sse pp 11 november 2013
Sse pp   11 november 2013Sse pp   11 november 2013
Sse pp 11 november 2013
 
Sse pp 11 november 2013 (1)
Sse pp   11 november 2013 (1)Sse pp   11 november 2013 (1)
Sse pp 11 november 2013 (1)
 
Briargrove Elementary Assessment Policy 2013-2014
Briargrove Elementary Assessment Policy 2013-2014Briargrove Elementary Assessment Policy 2013-2014
Briargrove Elementary Assessment Policy 2013-2014
 
Exceptional student services rigorous curriculum design
Exceptional student services rigorous curriculum designExceptional student services rigorous curriculum design
Exceptional student services rigorous curriculum design
 
Jessica Campora RESUME FINAL
Jessica Campora RESUME FINALJessica Campora RESUME FINAL
Jessica Campora RESUME FINAL
 
Middle School Conference EVAAS Workshop 2012
Middle School Conference EVAAS Workshop 2012Middle School Conference EVAAS Workshop 2012
Middle School Conference EVAAS Workshop 2012
 
Assessment update february 2014
Assessment update february 2014Assessment update february 2014
Assessment update february 2014
 
Baraabaru school indicators
Baraabaru school indicatorsBaraabaru school indicators
Baraabaru school indicators
 
Sse workshop 2 spring 2014
Sse workshop 2 spring 2014Sse workshop 2 spring 2014
Sse workshop 2 spring 2014
 
2012 IHE Institutes Leaders with Leaders Session
2012 IHE Institutes Leaders with Leaders Session2012 IHE Institutes Leaders with Leaders Session
2012 IHE Institutes Leaders with Leaders Session
 
One Year with Naviance Curriculum
One Year with Naviance Curriculum One Year with Naviance Curriculum
One Year with Naviance Curriculum
 
NSI 2014: Product Forum
NSI 2014: Product ForumNSI 2014: Product Forum
NSI 2014: Product Forum
 
How Tracking Student Data Improves Teaching Practices by Bernard Pierorazio
How Tracking Student Data Improves Teaching Practices by Bernard PierorazioHow Tracking Student Data Improves Teaching Practices by Bernard Pierorazio
How Tracking Student Data Improves Teaching Practices by Bernard Pierorazio
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
 

More from hardyrcs

Every student-ready (1)
Every student-ready (1)Every student-ready (1)
Every student-ready (1)hardyrcs
 
Teacherreports
TeacherreportsTeacherreports
Teacherreportshardyrcs
 
Smarterbalancedassessments10 12
Smarterbalancedassessments10 12Smarterbalancedassessments10 12
Smarterbalancedassessments10 12hardyrcs
 
Ncascd october18 ncpd_ipresentation2
Ncascd october18 ncpd_ipresentation2Ncascd october18 ncpd_ipresentation2
Ncascd october18 ncpd_ipresentation2hardyrcs
 
9 20-12meetingrevised
9 20-12meetingrevised9 20-12meetingrevised
9 20-12meetingrevisedhardyrcs
 
9 20-12meetingrevised
9 20-12meetingrevised9 20-12meetingrevised
9 20-12meetingrevisedhardyrcs
 
September+20+slide+deck
September+20+slide+deckSeptember+20+slide+deck
September+20+slide+deckhardyrcs
 
Standard 6 & 8
Standard 6 & 8Standard 6 & 8
Standard 6 & 8hardyrcs
 

More from hardyrcs (8)

Every student-ready (1)
Every student-ready (1)Every student-ready (1)
Every student-ready (1)
 
Teacherreports
TeacherreportsTeacherreports
Teacherreports
 
Smarterbalancedassessments10 12
Smarterbalancedassessments10 12Smarterbalancedassessments10 12
Smarterbalancedassessments10 12
 
Ncascd october18 ncpd_ipresentation2
Ncascd october18 ncpd_ipresentation2Ncascd october18 ncpd_ipresentation2
Ncascd october18 ncpd_ipresentation2
 
9 20-12meetingrevised
9 20-12meetingrevised9 20-12meetingrevised
9 20-12meetingrevised
 
9 20-12meetingrevised
9 20-12meetingrevised9 20-12meetingrevised
9 20-12meetingrevised
 
September+20+slide+deck
September+20+slide+deckSeptember+20+slide+deck
September+20+slide+deck
 
Standard 6 & 8
Standard 6 & 8Standard 6 & 8
Standard 6 & 8
 

EVAAS

  • 1. EVAAS for Educators Beginning – Intermediate – Advance Date
  • 2. Today’s Presenters Dr. Jody Cleven Professional Development Consultant Region 5 Jody.cleven@dpi.nc.gov Paul Marshall Professional Development Consultant Region 5 Paul.marshall@dpi.nc.gov
  • 3. Our Agenda • Welcome, Introductions, Agenda Overview • EVAAS and Data • System Overview • Growth Vs. Proficiency • Pre-Assessment • Reflective Assessments • Exit Ticket 3
  • 4. Outcomes: • Be familiar with reflective assessments • Understand the various EVAAS reports
  • 5. Can We Agree? • To be actively involved • Value differences • Agree to disagree • Listen
  • 8. Virtual Professional Development https://ncdpi.sas.com
  • 9. Data Literacy Module https://center.ncsu.edu/nc Data Resource Guide http://www.ncpublicschools.org/acre/improvement/resources/
  • 11. What do you know about EVAAS? • http://www.socrative.com/ • Click: Student Log in, Room: 92564
  • 13. What is EVAAS? So What Does It Do?
  • 14. What is Data? Data can be defined as information organized for analysis or used to make decisions.
  • 15. What is Data Literacy? Understanding needed to: • Find •Evaluate •Utilize to inform instruction.
  • 16. A Data Literate Person Can… A data literate person possesses the knowledge to gather, analyze, and graphically convey information to support short and long-term decision-making.
  • 17. NC Professional Teaching Standards Standard I: Teachers demonstrate leadership.  Take responsibility for the progress of all students  Use data to organize, plan, and set goals  Use a variety of assessment data throughout the year to evaluate progress  Analyze data Standard IV: Teachers facilitate learning for their students.  Use data for short and long range planning Standard V: Teachers are reflective on their practice.  Collect and analyze student performance data to improve effectiveness
  • 18. Standard 6 for Teachers Teachers contribute to the academic success of students. The work of the teacher results in acceptable, measurable progress for students based on established performance expectations using appropriate data to demonstrate growth.
  • 19. Benefits and Considerations for Teachers • Professional Development Understand academic preparedness of students is the Key before they enter the classroom. • Culture of School • Monitor student progress, ensuring growth opportunities • Sensitivity of Data for all students. • Finger Pointing and Blame • Modify curriculum, student Game support, and instructional • Window vs. Mirror strategies to address the needs of all students.
  • 20. NC Standards for School Executives Standard 2: Instructional Leadership • Focuses his or her own and others’ attention persistently and publicly on learning and teaching by initiating and guiding conversations about instruction and student learning that are oriented towards high expectations and concrete goals; • Creates processes for collecting and using student test data and other formative data from other sources for the improvement of instruction • Ensures that there is an appropriate and logical alignment between the curriculum of the school and the state’s accountability program • Creates processes for collecting and using student test data and other formative data from other sources for the improvement of instruction
  • 21. Standard 8 for School Executives Academic Achievement Leadership School executives will contribute to the academic success of students. The work of the school executive will result in acceptable, measurable progress for students based on established performance expectations using appropriate data to demonstrate growth.
  • 22. Benefits for Principals • Gain a consolidated view of student progress and teacher effectiveness, as well as the impact of instruction and performance. • Bring clarity to strategic planning and function as a catalyst for conversations that must take place to ensure that all students reach their potential. • Understand and leverage the strengths of effective teachers. • Use the valuable resource of effective teaching to benefit as many students as possible.
  • 24. Student Achievement Proficient End of School Year
  • 25. Student Progress Proficient r ve geo an me Ch ti Not Proficient Start of End of School Year School Year
  • 26. Student Progress Advanced Cha nge o time ver Proficient Start of End of School Year School Year
  • 27. Student Progress Advanced Cha nge o time ver Proficient r ve geo an me Ch ti Not Proficient Start of End of School Year School Year
  • 28. Achievement vs. Growth Student Achievement: Where are we? •Highly correlated with demographic factors Student Growth: How far have we come? •Highly dependent on what happens as a result of schooling rather than on demographic factors
  • 29. The EVAAS Philosophy • All students deserve opportunities to make appropriate academic progress every year. • There is no “one size fits all” way of educating students who enter a class at different levels of academic achievement.
  • 30. The EVAAS Philosophy • Adjustments to instruction should be based on the students’ academic needs, not on socio-economic factors. • "What teachers know and can do is the most important influence on what students learn." (National Commission on Teaching and America's Future, 1996)
  • 31. Achievement and Poverty How is this fair?
  • 32. Academic Growth and Poverty No one is doomed to failure.
  • 33. Proficiency vs Growth Scenario Proficient Growth 5th grader begins the year reading at a 1st grade level. NO YES Ends the year reading at a 4th grade level. 5th grader begins the year reading at a 7th grade level. Ends the year reading at the 7th YES NO grade level.
  • 35.
  • 36. What is EVAAS? So What Does It Do?
  • 38. How can EVAAS help me?
  • 39. Education Value Added Assessment System – Answers the question of how effective a schooling experience is – Produces reports that • Predict student success • Show the effects of schooling at particular schools • Reveal patterns in subgroup performance
  • 40. Changes in Reporting for 2012-13 2011-12 2012-13 Above Exceeds Expected Growth Not Detectably Meets Expected Different Growth Below Does Not Meet Expected Growth
  • 41. District Value Added Report •Use to evaluate the overall effectiveness of a district on student progress •Compares each district to the average district in the state for each subject tested in the given year •Indicates how a district influences student progress in the tested subjects
  • 44. Value-Added Reporting The NCE Base is by definition set at 50.0, and it represents the average attainment level of students in the grade and subject, statewide. If the school mean is greater, the average student in the school is performing at a higher achievement level than the average student in the state.
  • 45. District Diagnostic Reports • Use to identify patterns or trends of progress among students expected to score at different achievement levels
  • 47. District Performance Diagnostic Reports • Use to identify patterns or trends or progress among students predicted to score at different performance levels as determined by their scores on NC tests • Students assigned to Projected Performance Levels based on their predicted scores • Shows the number (Nr) and percentage of students in the district that fall into each Projected Performance Level
  • 49. Interpreting the Pie Chart Green Yellow Light Red
  • 52. Diagnostic Reports – the whiskers
  • 59. What would an ideal pattern on a Diagnostic Report look like for closing the achievement gap?
  • 60. Diagnostic Reports – Desirable Pattern
  • 63. Diagnostic Reports – the whiskers
  • 64. Overview of School Effects (sample data)
  • 65. Overview of School Effects (sample data)
  • 66. Overview of School Effects (sample data)
  • 67. Overview of School Effects (sample data)
  • 68. Overview of School Effects (sample data)
  • 69. Overview of School Effects (sample data)
  • 70. 1. Go to the website www.ncdpi.sas.com
  • 71. 1. Go to the website ncdpi.sas.com
  • 72. 1. Go to ncdpi.sas.com 2. BOOKMARK IT! 3. Secure & Convenient Online Login
  • 73. Do you see this? Then Sit Tight!
  • 74. Overview of School Effects It’s Your Turn! •Find the blank table. Do this by yourself. •Using sample data •Fill in your table.
• 75. Overview of School Effects – What did you find? • Interesting Patterns • Insights • Areas of Concern • Areas of Celebration
  • 76. 1. Go to the website ncdpi.sas.com
  • 78. Interpreting Your Results Microsoft Word Document
• 80. Student Patterns Report Key points to remember: • The report shows growth for the lowest, middle, and highest achieving students within the chosen group. • The report can be used to explore the progress of students with similar educational opportunities. • Like all diagnostic reports, this report is for diagnostic purposes only. • A minimum of 15 students is needed to create a Student Pattern Report.
  • 84. Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours?
  • 85. Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours? YES! Rerun the report with new criteria.
• 86. Student Pattern Report – Next Steps: All 31 Students vs. the 16 Students who attended the Program for 40+ hours
  • 87. Less Informed Conclusion: We need to change the selection criteria for this program. More Informed Conclusion: We need to adjust the recommended hours for participants.
  • 89. Academic At-Risk Reports • Reports – Academic At-Risk Report
• 90. Academic At-Risk Reports, 3 Categories: AYP at Risk (at risk for not meeting the academic indicators for AYP), Graduation at Risk (students at risk for not making a Level III on EOC subjects required for graduation), Other at Risk (students at risk for not making Level III on other EOC subjects)
• 91. Academic At-Risk Reports: Be Proactive. Use these reports to determine local policy for providing targeted intervention and support to students who are at risk for not meeting future academic milestones.
  • 92. Making Data Driven Decisions
  • 94. What Are Projections Anyway? Given a specific set of circumstances… …what’s the most likely outcome?
  • 95. What Are Projections Anyway? Given this student’s testing history, across subjects… …what is the student likely to score on an upcoming test, assuming the student has the average schooling experience?
  • 96. EVAAS Projections What are they based on? • Expectations based on what we know » About this student and other students who have already taken this test » Prior test scores (EOC/EOG), across subjects » Their scores on the test we’re projecting to
  • 97. What’s the Value of the Projections? Projections are NOT about predicting the future. They ARE about assessing students’ academic needs TODAY.
  • 98. Assessing Students’ Needs • What are this student’s chances for success? • What goals should we have for this student this year? • What goals should we have for this student in future years? What can I do to help this student get there?
  • 99. Using Projections to Take Action • Identify students • Assess the level of risk • Plan schedules • Identify high-achievers • Assess the opportunities • Inform
  • 100. Making Data Driven Decisions
  • 101. Data Mining Microsoft Word Document
• 106. Thinking of the State Distribution by Quintiles (Quintile 5 down to Quintile 1)
  • 107. Note the Student’s Projected QUINTILE QUINTILE 2
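The projected quintile on this slide is just the student's projected state percentile bucketed into fifths. A minimal sketch of that mapping (the `quintile` helper and its equal 20-point bands are illustrative assumptions, not EVAAS code):

```python
def quintile(state_percentile):
    """Map a 1-99 state percentile to Quintile 1 (lowest) through 5 (highest).

    Illustrative only: assumes equal 20-point percentile bands.
    """
    return min(int(state_percentile // 20) + 1, 5)

print(quintile(35))  # a projection at the 35th percentile lands in Quintile 2
```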
  • 108. Reflecting on Past Effectiveness to Plan for Differentiating Student Instruction Past Effectiveness Entering Achievement
  • 109. Reflecting on Past Effectiveness to Plan for Differentiating Student Instruction QUINTILE 2 Past Effectiveness Entering Achievement
  • 114. Thank You!!!! Dr. Jody Cleven Professional Development Consultant Region 5 Jody.cleven@dpi.nc.gov Paul Marshall Professional Development Consultant Region 5 Paul.marshall@dpi.nc.gov

Editor's Notes

  1. 1 min. Today’s presentation is being brought to you by… Click
  2. Facilitator Preference – may want a presenter and a driver, use split screens
  3. Link to EVAAS wiki once created http://evaas.ncdpi.wikispaces.net/home
  4. It may be a good idea to watch this…
5. This module provides an introduction to data literacy. It includes information on types of data, strategies for analyzing and understanding data, and processes for determining how these can influence instructional practices. This module aims to provide learning experiences that develop or enhance abilities to find, evaluate, and use data to inform instruction. The purpose of the Data Resource Guide is to provide information and resources to help administrators, data coaches, teachers, and support staff with data-driven decision making for their schools and districts. Districts and charter schools are encouraged to use this guide to design and implement training for data teams with the goal of increased data literacy and student achievement.
6. The following slides are PollEverywhere slides to get a feel for what participants already know. The PollEverywhere questions are hyperlinked on the agenda on the wiki. Facilitators should clear poll results after the presentation and/or check to see if the group before you cleared their results.
7. OPTION: Presenters may use either PollEverywhere or Socrative for the pre-assessment activity. http://www.socrative.com/ Click: Student Log in, Room: 92564. Teacher login – [email_address] Password – pdteam. If you expect more than 50 participants, you can replace this slide with the PollEverywhere slides with the same pre-assessment questions, hyperlinked on a wiki page. This is faster than texting in answers.
  8. EVAAS measures the progress students make within your district, school, or classroom, compared to the progress students make, on average, statewide. It is available to all schools and districts in North Carolina. Copyright © 2010, SAS Institute Inc. All rights reserved.
  9. Data literacy refers to the understanding needed to find, evaluate, and utilize data to inform instruction. A data literate person possesses the knowledge to gather, analyze, and graphically convey information to support short and long-term decision-making.
  10. Data literacy refers to the understanding needed to find, evaluate, and utilize data to inform instruction. A data literate person possesses the knowledge to gather, analyze, and graphically convey information to support short and long-term decision-making.
  11. robin
  12. Talk about how PD is imperative. We have to teach the teachers HOW to use the data. Having DATA CONVERSATIONS is an imperative. Knowledge is power, and your role as a principal is to prepare teachers for how to handle the data. Understanding that the purpose of EVAAS is for us to support student understanding and to make appropriate instructional, logistical, and professional decisions to support student achievement and growth. You know your teachers and know who may not be able to handle access.
  13. But also relates to Standard 3 Cultural Leadership – fair and consistent evaluations of teachers, provides for differentiated pd according to teachers’ needs, etc. And to Standard 7 Micropolitical Leadership – allocation of resources, communication within the school.
14. Often in middle schools, we test students by homeroom, and when our data returns, we must, at the school level, redistribute the students to align with the teacher who taught them. EVAAS, however, is sophisticated enough to put kids in the classes where they belong because it pulls the student assignment from NCWISE. If the school’s NCWISE data is correct, the student will be properly placed with the appropriate teacher for each course. Talk about working with Kim to pull prediction data and use it in data conversations with every single EOC teacher. She provided each teacher with this information. Planning and discussion sessions. Used data in scheduling as well.
15. A focus on achievement or proficiency looks like this… Student is able to meet specific standards. Students fall into a limited range or band of achievement. Does not account for change outside of that range. Does not account for student ability before they came to class. “I can do it.”
  16. By concentrating on the growth students make, EVAAS puts the emphasis on something educators can influence.
  17. EVAAS value-added modeling is based on the philosophy that all kids count and that schools should not be held responsible for the things they cannot change, like a child’s socio-economic status, and that schools should be responsible for the things they can change, like a child’s growth during a year of schooling. We believe that: --All kids count --All kids can learn --All kids deserve opportunities to make appropriate academic progress every year --Educators can manage their effectiveness to improve student opportunities.
  18. EVAAS value-added modeling is based on the philosophy that all kids count and that schools should not be held responsible for the things they cannot change, like a child’s socio-economic status, and that schools should be responsible for the things they can change, like a child’s growth during a year of schooling. We believe that: --All kids count --All kids can learn --All kids deserve opportunities to make appropriate academic progress every year --Educators can manage their effectiveness to improve student opportunities.
  19. Some people believe that this type of analysis is unfair because it penalizes economically disadvantaged students. This scatterplot illustrates the correlation between student achievement and poverty, which is unfair. Now let’s look at the correlation between student growth and poverty. Discuss ‘fair’ in relation to the data and to teacher evaluation/accountability vs. ‘fair’ delivery of curriculum and instruction to subgroups
  20. Click on slide for answers Which scenario is a better indicator of the effectiveness of the teacher?
21. 8 minute video embedded from SAS about EVAAS. Video will connect to the web. Hyperlink to the video is below if you experience technical issues. http://www.sas.com/apps/webnet/video-sharing.html?videoToLoad=rtmp://channel.sas.com/vod/clip/9000_EVAAS16x9_rev_F8dotCom&height=295&width=480&posterFrame=null&caption=null&captionShowAtStart=0&referralPage=http%3A//www.sas.com/govedu/edu/k12/evaas/index.html EVAAS was created by a university professor in Tennessee – originally TVAAS – “value-added, effectiveness data.” EVAAS looks at the district, school, teacher, and student. It is often difficult for us to come out of the AYP/ABC box, but it is important to understand that EVAAS data is very different. In the broad sense, EVAAS takes into consideration the fact that students come to us with very different ability levels. Instead of just measuring each student against a proficiency score, EVAAS measures the results of a student’s classroom experience – in essence – EVAAS measures the EFFECT of schooling on student learning.
22. EVAAS measures the progress students make within your district, school, or classroom, compared to the progress students make, on average, statewide. It is available to all schools and districts in North Carolina.
  24. EVAAS helps by allowing educators to: Analyze past program performance for trends Make informed projections for current and incoming students
  25. EVAAS extracts data AFTER DPI collects data through the secure shell. DPI runs processes and checks for validity. Once DPI has completed their processes with the data, they present to the SBE. At this point, data is sent to EVAAS.
26. Changes in reporting for 2012-13. 2011-12 color coding and descriptors: Above (Green) – students in the district made significantly more progress in this subject than students in the average district in NC; progress was at least two standard errors above average. NDD (Yellow) – Not Detectably Different from students in the average district; less than two standard errors above average and no more than two standard errors below it. Below (Light Red) – students in the district made significantly less progress in this subject than students in the average district in NC; progress was more than two standard errors below average. 2012-13 color coding and descriptors: Exceeds Expected Growth (Blue): estimated mean NCE gain is above the growth standard by at least 2 standard errors. Meets Expected Growth (Green): estimated mean NCE gain is no more than 2 standard errors below the growth standard and less than 2 standard errors above it. Does Not Meet Expected Growth (Red): estimated mean NCE gain is below the growth standard by more than 2 standard errors. The descriptors in EVAAS now match the Standard 6 ratings.
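The 2012-13 descriptors above boil down to a threshold test on the gain estimate and its standard error. A minimal sketch, assuming the NCE gain and standard error are already in hand (the function name is hypothetical, not part of EVAAS):

```python
def growth_rating(mean_nce_gain, std_error, growth_standard=0.0):
    """Classify per the 2012-13 EVAAS descriptors:

    Exceeds (Blue): gain is at least 2 standard errors above the standard.
    Does Not Meet (Red): gain is more than 2 standard errors below it.
    Meets (Green): everything in between.
    """
    diff = mean_nce_gain - growth_standard
    if diff >= 2 * std_error:
        return "Exceeds Expected Growth (Blue)"
    if diff < -2 * std_error:
        return "Does Not Meet Expected Growth (Red)"
    return "Meets Expected Growth (Green)"

print(growth_rating(3.1, 1.2))   # 3.1 >= 2 * 1.2, so Exceeds (Blue)
print(growth_rating(-0.5, 1.0))  # within +/- 2 SE, so Meets (Green)
```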
  27. We will look at three kinds of reports - value-added, diagnostic, performance diagnostic - at the district level and review how to read them b/c this is the same way you will read your school data. The reports have elements in common and once you can interpret the district reports, you’ll be able to read your school reports easily.
  28. Use this report to evaluate the overall effectiveness of a school on student progress. The School Value Added Report compares each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects. * Facilitator preference for the next few slides until break – use only power point or use live site to model – more questions pop up when using the live site
  29. Scores from the EOG tests are converted to State NCEs (Normal Curve Equivalent scores) for the purpose of these analyses. NCE scores have the advantage of being on an equal-interval scale, which allows for a comparison of students' academic attainment level across grades. NCE scores remain the same from year to year for students who make exactly one year of progress after one year of instruction, even though their raw scores would be different. Their NCE gain would be zero.
  30. Student achievement levels appear at the bottom of the report in the Estimated School Mean NCE Scores section. The NCE Base is by definition set at 50.0, and it represents the average attainment level of students in the grade and subject, statewide. Compare the estimated grade/year mean for a school to the NCE Base. If the school mean is greater, the average student in the school is performing at a higher achievement level than the average student in the state.
  31. Caution: subgroup means come from “a liberal statistical process” that is “less conservative than estimates of a district’s influence on student progress in the District Value Added Report”
32. Use this report to identify patterns or trends of progress among students expected to score at different achievement levels. This report is intended for diagnostic purposes only and should not be used for accountability. Explain that on this report the students are grouped into quintiles. Students are assigned to groups on a statewide basis. The assignment pattern shows schools how their students are distributed compared to other students in the same grade across the state, and how their performance compares to similar students throughout the state.
33. Click on the underlined number in the Mean or Nr of Students row for a subgroup to see the names of the students assigned to the subgroup. Click on the % of Students for the current year or for Previous Cohort(s) to see the data in Pie Chart format. Mean Differences: The Mean of the difference between the students’ observed test performance and their predicted performance appears for each Projected Performance Level, along with the Standard Error associated with the Mean. The Standard Error allows you to establish a confidence band around the Mean. A large negative mean indicates that students within a group made less progress than expected. A large positive mean indicates that students within a group made more progress than expected. A mean of approximately 0.0 indicates that a group has progressed at an average rate in the given subject. When the means among groups vary markedly, districts may want to explore ways to improve the instruction for students making less progress.
34. The Reference Line in the table indicates the gain necessary for students in each Prior-Achievement Subgroup to make expected progress, and it reflects the growth standard. When Gain is reported in NCEs, as it is here, the growth standard is 0.0. The Gain is a measure of the relative progress of the school's students in each Prior-Achievement Subgroup compared to the Growth Standard. Standard errors appear beneath the Gain for each Prior-Achievement Subgroup. The standard error allows the user to establish a confidence band around the estimate. The smaller the number of students, the larger the standard error. A student becomes a member of a Prior-Achievement Subgroup based on the average of his or her current and previous year NCE scores. A single student score contains measurement error; using the average of two years allows a more appropriate assignment. The Nr of Students row shows the number of students in a subgroup. Some subgroups may contain more students than others because students are assigned to groups on a statewide basis. The assignment pattern shows schools how their students are distributed compared to other students in the same grade across the state.
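The subgroup assignment described here, averaging the current and previous year NCE scores before bucketing, can be sketched as follows (the helper and cut points are illustrative assumptions; EVAAS assigns groups on a statewide basis):

```python
def prior_achievement_subgroup(current_nce, previous_nce,
                               cut_points=(20, 40, 60, 80)):
    """Assign a student to one of five prior-achievement subgroups.

    Averaging two years of NCE scores smooths single-test measurement
    error before the assignment is made.
    """
    avg = (current_nce + previous_nce) / 2
    group = 1  # 1 = lowest subgroup, 5 = highest
    for cut in cut_points:
        if avg >= cut:
            group += 1
    return group

print(prior_achievement_subgroup(55.0, 48.0))  # average 51.5 -> subgroup 3
```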
  35. The Pie Chart shows the percent of students in each subgroup and compares their progress to the Growth Standard. Yellow: students in this group progressed at a rate similar to that of students in the average district in the state. Light Red: students in the group made more than one standard error less progress in this subject than students in the average district in the state. Green: the progress of students in this group was more than one standard error above that of students in the average district in the state.
  36. Participants should go to the EVAAS wiki. Under the agenda section have users click on the reflective assessments link. Files to support this portion of the presentation can be found there.
37. Use to evaluate the overall effectiveness of a school on student progress. Compares each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects. Has to be more than -1.8 to be below; more than 2 standard errors to be above. (Provide definition of value-added.) Some things to note are that “like” students are in a subgroup. Show how to read the report. Explain that 0 is the equivalent of one year of growth. Looking at the bottom of the page at the green, yellow, and red descriptors, explain that there is a blue descriptor when there is not enough data to make a distinction. If your LEA uses DIBELS, the reports have similarities in design with the colors. Have the participants look at the data and talk about what they can see here. Look at trends in the same grade. Look at the students who moved from 6th grade in 2011 to 7th grade in 2012 and to 8th grade in 2013. Talk about what you can take away from this report.
38. In this diagram, the two bars have the same height, but the whiskers extend to different lengths. On the left, the whiskers lie completely below the green line, so the group represented by the bar made less than average progress (↓). On the right, the whiskers contain the green line, so the group represented by the bar made average progress (-). The red whisker represents the confidence interval due to the standard error for the mean. There is a high probability that the actual mean falls somewhere within the whiskers. The size of the confidence interval is determined by the sample size: the larger the number of data points, the smaller the standard error, the smaller the whisker, and therefore the narrower the confidence interval. If the whisker passes over the green line (the reference line), the data shows expected growth, since there is a chance the mean is actually on the other side of the green line. It is not certain that the teacher is exclusively above or below the reference line if the whisker crosses the green reference line.
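The whisker logic in this note reduces to checking whether the confidence band (taken here as the mean ± one standard error, an assumption for this sketch) contains the reference line:

```python
def whisker_symbol(mean_gain, std_error, reference=0.0):
    """Classify a diagnostic-report bar by where its whiskers fall."""
    low, high = mean_gain - std_error, mean_gain + std_error
    if low > reference:
        return "up"    # whiskers entirely above the green line
    if high < reference:
        return "down"  # whiskers entirely below the green line
    return "average"   # whiskers cross the line: expected growth

# Same bar height, different whisker lengths -> different conclusions:
print(whisker_symbol(-1.0, 0.5))  # short whisker stays below the line
print(whisker_symbol(-1.0, 1.5))  # long whisker crosses the line
```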
  39. In this school, some subgroups of students are not making sufficient gain. Students in the lowest subgroups have not made sufficient progress while high achieving students are making excellent gain. The lack of appropriate progress among low achieving students is a pattern that has been repeated from previous years, indicating a persistent lack of effectiveness with lower achieving students. This is one of the most intriguing components of EVAAS. Common Diagnostic Patterns activity *Pattern Slides are on wiki
  40. In this example, the lowest achieving students are making sufficient progress. Students at an average achievement level are making expected progress. However, the highest achieving students appear to be losing ground. Teachers and administrators will want to find ways to create more progress opportunities for high achieving students.
  41. In this example, high achieving students are making excellent progress. Students who are average in achievement also are making sufficient progress. In contrast, the lowest achieving students are not making as much progress as they should. A pattern like this one will widen the achievement gap. Teachers and administrators should consider how to help lower achieving students gain more ground.
  42. In this example, the students in the middle of the achievement distribution are making sufficient progress, but both lower achieving and higher achieving students are falling behind their peers. In this case, teachers and administrators will want to consider both how to support low-achieving students and how to challenge high-achieving students.
  43. In this example, the opposite of the Tent Pattern, only the lowest and the highest achieving students are making good progress. Students in between have not had enough opportunities for academic growth.
  44. In this example, the students in every achievement group are making sufficient progress in the most recent year, except for the second group. Teachers and administrators will want to consider how to adjust the classroom instruction to meet these students’ needs. In addition, what approaches that are successful with the lowest achieving students could be expanded to include students in the second achievement group?
  45. Have participants draw an ideal pattern On index card, make a box with 1-5 at bottom. Draw the ideal diagnostic report; discuss Ideal to narrow the achievement gap: 1 is highest, descending to 5 Common Diagnostic Patterns activity – look at common patterns; taking turns, explain the pattern to your partner
  46. Print and handout this slide for the next activity on drawing a desirable pattern or have participants use plain paper.
47. In this example, all bars are above the green line, indicating the district was highly effective with students in all achievement groups. Additionally, students in the lowest quintile made more progress than students in the other quintiles. Effectively, these students are starting to catch up with their peers; the gap is closing because they are increasing their performance by more than a year’s worth of growth.
48. In this diagram, the two bars have the same height, but the whiskers extend to different lengths. On the left, the whiskers lie completely below the green line, so the group represented by the bar made less than average progress (↓). On the right, the whiskers contain the green line, so the group represented by the bar made average progress (-). The red whisker represents the confidence interval due to the standard error for the mean. There is a high probability that the actual mean falls somewhere within the whiskers. The size of the confidence interval is determined by the sample size: the larger the number of data points, the smaller the standard error, the smaller the whisker, and therefore the narrower the confidence interval. If the whisker passes over the green line (the reference line), the data shows expected growth, since there is a chance the mean is actually on the other side of the green line. It is not certain that the teacher is exclusively above or below the reference line if the whisker crosses the green reference line.
49. Place activity instructions on the wiki. Use the Value-Added and Diagnostic reports to complete the table. Navigate to a Value-Added Report and enter the Tested Subject/Grade name in the Overview of School Effectiveness table below. Locate the color for the most recent year. If the color is RED, place an “X” in the Overall Results column. Use a separate row for each grade for EOG reporting. If your school tests in both EOG and EOC subjects, record the EOG subjects and grades and then choose EOC from the Tests tab. For each test, subject, and/or grade, note the color for the most recent year. If the color is RED, place an “X” in the Overall Results column.
50. Drill down to the Diagnostic Report for each Tested Subject/Grade. Locate the blue bars on the graph for each of the 5 Achievement Groups. Also note the red whiskers. For any blue bars above the green line (where the whiskers are also completely above), place an up arrow (↑) in the appropriate cell of the Overview of School Effectiveness table. For any blue bars below the green line (where the whiskers are also completely below), place a down arrow (↓) in the table. For any blue bars at or near the green line (the whiskers cross the green line), place a horizontal dash (–) in the table.
  51. These documents will be uploaded to the wiki An extra set of sample data has been loaded on the wiki if participants can’t log into EVAAS
  52. These documents will be uploaded to the wiki
  53. These documents will be uploaded to the wiki
  54. Double check for correct answers
59. Participants will examine sample data located on the wiki site to complete this activity. Locate the blue bars on the graph for each of the 5 Achievement Groups. Also note the red whiskers. For any blue bars above the green line (where the whiskers are also completely above), place an up arrow (↑) in the appropriate cell of the Overview of School Effectiveness table. For any blue bars below the green line (where the whiskers are also completely below), place a down arrow (↓) in the table. For any blue bars at or near the green line (the whiskers cross the green line), place a horizontal dash (–) in the table.
  60. This is a sharing activity – Think-Pair-Share, TTP, Tea Party, etc.
61. Log back in.
  62. Have participants find their school or a school in their district. Identify the patterns for your subjects and grade levels Patterns and explanations can be located on the wiki in the Reflective Assessments portion of the agenda.
  63. Complete the Interpreting Your Results downloadable take home form on the Wiki. Have participants go to the Reflective Assessments section of the wiki and download the Interpreting Your Results documents.
  64. This report is a customized Diagnostic report where you can examine progress for groups of students of your choice. It is only available from the school level and only to users w/access to student reports.
65. Enables you to see how effective the school has been with the lowest, middle, and highest achieving students; at least 15 students with predicted and observed scores are required.
66. Take a look at this data. What do you notice? What are your thoughts? Higher students did better than expected; our middle group did not do as well as predicted. The low, middle, and high groups are placed in thirds based on their predicted scores and where they fall in the distribution. Teacher self-reflection: how effective were you in teaching, based on their predicted scores?
67. This teacher may have had 100% proficiency, but we are looking at growth; this teacher did not contribute to the students’ learning. Students are listed by name by subgroup, so that you can identify which group each child falls in. Look at individual students to see if we note any patterns. We can also look at race and gender, and see if some teachers are teaching better to a certain sub-population.
  68. We need to ask some key questions to find out why some students had better growth than others. We could even look at each subgroup individually and think about what contributes to negative and positive growth in the classroom.
69. In this case we are comparing students in the same sub-group, the group that is considered the H group. Some key questions we might want to ask are on the slide. After asking these questions, we find that in this particular report the number of hours a student participated in a program made a positive difference in growth. At this point we looked at the number of hours, so we ran another report of all 31 students to see if it had a large effect on student growth. We looked at all students who had over 40 hours of enrichment/remediation, etc…
70. The 16 students who had over 40 hours in the program showed far greater growth than their counterparts who did not participate. The 15 who didn’t participate for 40+ hours really negatively affected the overall growth. If you run a report and this is your result, think through the next step to figure out what the numbers actually mean. This shows the program did what you wanted.
71. These reports may be used to determine local policy for providing targeted intervention and support to students who are at risk for not meeting future academic milestones. At-Risk reports for EOG and EOC subjects include students with a 0-70% probability of scoring in the Level III range. The range for writing is 0-80%. The reports are presented in 3 categories. AYP At Risk: at risk for not meeting the academic indicators for AYP (EOG Math and Reading, grades 4-8; EOC Algebra I and English I). For EOG tests, students with at least 3 prior data points (test scores) will have projections in Math and Reading in the next grade. These scores are not content specific. Projections for Algebra I and English I may be made as early as 6th grade with sufficient data. Graduation At Risk: reports for students at risk for not making a Level III on EOC subjects such as Algebra II, Chemistry, Geometry, Physical Science, and Physics. Students who have taken these tests but have not scored at least Level III will still have projections to these subjects. Under Reports, click Academic At-Risk Reports. These are reports that you will want to spend some time really poring through.
  72. Same report
73. At-Risk reports for EOG and EOC subjects include students with a 0-70% probability of scoring in the Level III range, or 0-80% for writing.
74. This student has a 2% chance of achieving a Level III on the Algebra I EOC. EVAAS can show growth, so teachers may want to take on this child to show some serious growth. Have programs in place to show great growth for these students. EVERY kid matters: we are measuring growth, not proficiency. Talk about the “clickables” and ways to disaggregate this data: students are listed alphabetically with demographic and other info. You can sort the report by clicking on the underlined column heading. A key to each column heading appears below the report. To see a student report, click on the student’s name. All students in the report have a 0-70% probability of scoring Level III in the subject you have chosen (0-80% for writing), assuming they have the average schooling experience in NC. These students will need support and intervention to provide them with a better than average schooling experience if they are to be successful. Consider different strategies. Talk about the defaults.
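The inclusion rule for the at-risk reports is a probability cutoff (0-70% for EOG/EOC subjects, 0-80% for writing). A toy filter under those thresholds; the student rows are made up for illustration:

```python
# Made-up rows: (name, subject, probability of scoring Level III or above)
students = [
    ("Student A", "Algebra I", 0.02),
    ("Student B", "Writing", 0.75),
    ("Student C", "English I", 0.88),
]

def at_risk(subject, probability):
    """Apply the report's cutoff: 0-70% generally, 0-80% for writing."""
    cutoff = 0.80 if subject == "Writing" else 0.70
    return probability <= cutoff

flagged = [name for name, subj, p in students if at_risk(subj, p)]
print(flagged)  # ['Student A', 'Student B']
```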
  75. Although projections indicate how a student will likely perform on a future test, their real value lies in how they can inform educators today. By incorporating the projections into their regular planning, teachers, administrators, and guidance counselors can make better decisions about how to meet each student’s academic needs now. Copyright © 2010, SAS Institute Inc. All rights reserved.
76. When assessing students’ academic needs, educators will want to keep these key questions in mind.
77.
• Identify students who need to participate in an academic intervention
• Assess the level of risk for students who may not reach the Proficient mark
• Plan schedules and resources to ensure that you can meet students’ needs
• Identify high-achievers who will need additional challenges
• Assess the opportunities for high-achieving students who are at risk of not reaching Advanced
• Inform course placement decisions
78. Have participants access their Academic At Risk report. Select a grade level and subject to view achievement probability. These students will need support and intervention to provide them with a better than average schooling experience if they are to be successful. Consider different strategies. Talk about the defaults.
  79. Data mining is sometimes referred to as data or knowledge discovery. Have participants access their academic at risk report. Select a grade level and subject to view achievement probability. Answer the following questions based on your data.
80.
• Red dot: Student's testing history. Roll over a dot to see the school and district in which the student was tested.
• Yellow box: Student's Projected State Percentile, assuming average progress.
• Performance Level Indicators: Cut score required to be successful at different performance levels, expressed in State Percentiles. See the key below the graph.
81. Reading left to right: the student's projected State Percentile for the chosen test, then the probability of success at different performance levels.
82. The table shows the student's testing history across grades, in State NCEs (EOG Math and Reading) or scale score points (all other tests). For EOC tests, the season in which the test was administered, Fall (F), Spring (Sp), or Summer (Su), is indicated. The year of the test refers to the school year to which the test is attributed. For example, EOC tests administered in the summer and fall of 2010 will be labeled 2011 because they are attributed to the 2010-2011 school year. 3rd grade pretests are considered to measure 2nd grade achievement and are therefore attributed to the previous school year and labeled (2) for 2nd grade.
83. Each student's achievement quintile is based on his/her Projected State Percentile.
84. Notice where each student falls in the state distribution. That is, identify each student's achievement quintile based on his/her Projected State Percentile.
  85. Use this report to identify past patterns or trends of progress among students expected to score at different achievement levels
  86. How effective was your school with the lowest two quintiles?
87. Activity: use the Bridge to Differentiated Instruction document. This report shows the probability that students within a grade will score at or above Level III on future tests. The table shows the number and percentage of students in each of three probability groups, as well as the number and percentage of students who have already passed the test with a Level III or higher and those with insufficient data for a projection.
• Green: students whose probability of proficiency on the chosen test is greater than or equal to 70%
• Yellow: students whose probability of proficiency on the chosen test is between 40% and 70%
• Light Red: students whose probability of proficiency on the chosen test is less than or equal to 40%
• Blue: students who have already passed the test with a Level III or higher
• White: students who do not have a projection, due to lack of sufficient data
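For facilitators who want to walk through the grouping logic itself, the color rule above can be sketched in a few lines of code. This is a hypothetical illustration only, not EVAAS's actual implementation: the field names (`proficiency_probability`, `already_passed`) are invented, and the boundary cases (exactly 40% or 70%) are resolved one plausible way.

```python
# Hypothetical sketch of the report's probability color groups.
# Thresholds follow the slide: green >= 70%, yellow between 40% and 70%,
# light red <= 40%; already-passed and missing-data students are separate.
# Field names are assumptions for illustration.

def probability_group(student):
    """Return the color group for one student's projection record."""
    if student.get("already_passed"):          # scored Level III or higher
        return "blue"
    prob = student.get("proficiency_probability")
    if prob is None:                           # insufficient data to project
        return "white"
    if prob >= 70:
        return "green"
    if prob > 40:
        return "yellow"
    return "light red"

students = [
    {"name": "A", "proficiency_probability": 85},
    {"name": "B", "proficiency_probability": 55},
    {"name": "C", "proficiency_probability": 30},
    {"name": "D", "already_passed": True},
    {"name": "E"},  # no projection data
]
for s in students:
    print(s["name"], probability_group(s))
```

Running the sketch buckets the five sample records into green, yellow, light red, blue, and white respectively, mirroring the five rows of the report's color key.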
  88. Have participants visit wiki to download step by step instructions.
89. Post directions on the EVAAS Wiki
  90. 1 min. Today’s presentation is being brought to you by… Click
  91. Swap out images for pictures