SC Assessment Summit March 2013

Covers a wide variety of updates on NWEA products and services

  • Multiple-choice items are hand-aligned by the NWEA Content Services Team. This has resulted in tight alignment to the standards at a granular level in each content area. Items are aligned directly to grade-level standards, which assures each item measures appropriate content and is appropriate for the item pool, allows us to do a gap analysis to find standards with few alignments, guides new item development and pool enhancement, and allows us to demonstrate alignment. MAP is multiple-choice only, and items in the pool have no grade association: it is a MAP pool.
  • The purpose of this slide is to show our ongoing progression of CCSS-aligned MAP and the introduction of our new Blended Assessment solution.
  • Alignment between assessment and curriculum/instruction is fundamental; performance may be affected if there is misalignment. Will we see a drop in RIT scores? (Short video.)
  • Below is a list of questions you can choose from when appropriate for your partner: How will teachers and students understand where and how learning is happening? How will the Consortia support teachers? Are the online resources the only planned offering? When will the Consortia have outside evidence that their systems are accurate? What is the plan for scoring performance assessments? What research will support this scoring? Are scores from SBAC's short and long test versions intended to inform the same decisions? How precise will the underlying scores be for supporting and informing decisions?
  • It may be possible to use the Intel Visual Ranking tool to rank-order these: http://www.intel.com/content/www/us/en/education/k12/thinking-tools/visual-ranking/overview.html (Short video.) Additionally, PD to support the use of data, and strategies and techniques for differentiating and embedding formative assessment minute-to-minute and day-by-day. The standards are very skill-based; curriculum is what makes our work unique, individualized to our students and our students' needs, and supportive of personalized learning environments. More focus, more coherence, and more rigor, building habits of mind.
  • These features are very relevant when competing in the Consortia world; SBAC and PARCC are missing almost all of these distinctions.
  • CUSTOMIZE THIS SLIDE for either PARCC or SBAC. Below is a list of suggested questions you can choose to use as appropriate for your partners: When will Consortia states have a valid method to predict performance on summative assessments? When will the Consortia states have cut scores? What are the plans for scale development? How will states get longitudinal growth data? How will teachers determine growth targets for every child? How will teachers and students track improvement and learning during the year? Where are SBAC's and PARCC's research plans? How soon will norms be available? Are there plans to develop growth and status norms?
  • The green line is each teacher's value-added estimate, and the bar is the error of measure. At both the top and bottom, people can be in other quartiles, and people in the middle can cross quintiles based on the SEM alone. It is like cross-country: the winners spread out and the end of the race is spread, but the middle forms a pack, where moving up a little makes a big difference in the overall race. The instability and narrowness of the ranges mean that, when evaluating teachers in the middle of the distribution, slight changes in measured performance can produce a large change in performance ranking.
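  • The note above can be sketched numerically: when each growth-index estimate carries a standard error, two teachers can be reliably rank-ordered only if their confidence intervals do not overlap. This is an illustrative sketch, not NWEA's actual model; all numbers are made up.

```python
def interval(estimate, sem, z=1.96):
    """95% confidence interval around a growth-index estimate."""
    return (estimate - z * sem, estimate + z * sem)

def rankings_distinguishable(est_a, est_b, sem=1.0):
    """Two teachers are reliably rank-ordered only if their intervals do not overlap."""
    lo_a, hi_a = interval(est_a, sem)
    lo_b, hi_b = interval(est_b, sem)
    return hi_a < lo_b or hi_b < lo_a

# Teachers packed in the middle: estimates only 0.2 apart, SEM of 1.0
print(rankings_distinguishable(0.1, 0.3))   # False: the ranking is noise
# Teachers at the extremes: estimates far apart
print(rankings_distinguishable(-5.0, 5.0))  # True: the difference is real
```

This is the "pack in the middle" effect: with overlapping intervals, a slight change in measured performance reorders the pack without reflecting any real difference.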
  • We heard from many of you that your situation is not exactly typical, and you wanted a way to compare the growth of your students to something matched to your situation: an apples-to-apples comparison. That's why we created Virtual Comparison Groups, or VCGs. {Click} This is a sample report at the classroom level that shows individual students and how they did compared to similar students in similar schools, and compared to the state proficiency standards. The black is the actual growth of each student. The grey is the growth of a randomly selected group of students matched on both student and school criteria, such as grade, subject, starting achievement, the school's free and reduced lunch percentage, and school location (is it an urban school or a rural school?). The green and blue lines provide a reference point for how your state classifies your students' achievement. VCG reports like these, including ones showing higher levels too, come with a simple electronic tool for further exploring the data.
  • Whether showing individual students in a classroom or grades in a district, think about how VCGs can change the conversations. When comparisons are apples to apples, people know what is possible. Conversations can focus on individual students or on grade levels, as part of one-on-one conversations or a school improvement process. I can explain more about VCGs if you want to see me later, or look for an email coming in early April with more information.

    1. NWEA's South Carolina Assessment Summit, March 25-28, 2013
    2. Slideshare: http://www.slideshare.net/NWEA/sc-assessment-summit-march-2013
    3. Your NWEA Team: Laura Riley, Sue Madagan, Alison Levitt, Andy Hegedus
    4. NWEA Update: Updated SC Linking Study, Common Core, New Items, Field Testing, Teacher Evaluation, Virtual Comparison Groups, Growth Norms, College Readiness
    5. MAP Aligned with Common Core: Ongoing Process. The Early Years: (2010) A Review and Validation of the Common Core Standards, presented to CCSSO, Achieve, and the National Governors' Association; (2011) NWEA releases the first Common Core version of MAP, hand-aligned, resulting in tight alignment in each content area. The Bridge Years: (2012) state-aligned versions built for states who adopted CCSS +15%; technology-enhanced items under development; (2013) new test versions being built to reflect grade-level standards; stand-alone field testing commences.
    6. MAP Aligned with Common Core: Ongoing Process. MAP Aligned with CC, Version Three (Fall 2012) and Version Four (Fall 2013): better content coverage; deeper assessment of a student's depth of knowledge (DOK); increased item/test validity; a more engaging test experience for students. Blended Assessment (release timing TBD): measures both proficiency and growth; a new MAP assessment solution based on the CCSS that will provide kids, educators, and parents with both on-grade and off-grade performance information.
    7. South Carolina CCSS Timeline: 2011-2012, Transition Year; 2012-2013, Transition Year; 2013-2014, Bridge Year; 2014-2015, Full Implementation.
    8. Impact on Data and Reports. Growth Measurement: no impact; comparisons are made at the measurement scale level and are not affected by changes in state standards. Projected Proficiency: the updated SC linking study will continue to be used; a new linking study will be conducted once a sufficient number of students have completed the CCSS (+15%) aligned state assessments. Goal Level Reporting: no impact; data is automatically aggregated by goal structure. Norms: content-independent (used across multiple states' standards) and usable for Common Core assessments. DesCartes/PGID: no impact; DesCartes/PGID will be available for the currently licensed assessments.
    9. NWEA Update: Updated SC Linking Study, Common Core, New Items, Field Testing, Teacher Evaluation, Virtual Comparison Groups, Growth Norms, College Readiness
    10. Link to Additional Item Types (http://www.nwea.org/cc_sample/item2.html): Common Stimulus, Hot Spot, Drag and Drop, Click and Pop, Graphing Calculator, Keyboard Entry, Drop Down Lists, Turn and Slide, Multiple Enhancements
    11. Technology-Enhanced Items with Interactive Elements (http://www.nwea.org/common-core-new-item-types-map): better content coverage; deeper assessment of a student's depth of knowledge (DOK); increased item/test validity; a more engaging test experience. Interactive elements supported in July 2012: drag and drop, click and pop, hot spot.
    12. Common Stimulus Items: share a common item asset with other items; chosen adaptively, and if selected, a number of items associated with the common stimulus item are presented consecutively. Common stimulus items, available in July 2012, use passages as the common asset and allow deeper assessment of reading comprehension.
    13. Sample Item - Common Stimulus Item
    14. Sample Item - Common Stimulus Item
    15. Sample Item - Common Stimulus Item
    16. Sample Item - Drag & Drop Item
    17. Sample Item - Drag & Drop Item
    18. Sample Item - Drag & Drop Item
    19. Sample Item - Drag & Drop Item
    20. Sample Item - Click & Pop Item
    21. Sample Item - Click & Pop Item
    22. Sample Item - Click & Pop Item
    23. Sample Item - Click & Pop Item
    24. Sample Item - Hot Spot Item
    25. Sample Item - Hot Spot Item
    26. Sample Item - Hot Spot Item
    27. Sample Item - Hot Spot Item
    28. Link to Additional Item Types: Graphing Calculator, Keyboard Entry, Drop Down Lists, Turn and Slide, Multiple Enhancements
    29. Graphing Calculator
    30. Keyboard Entry
    31. Drop Down Menu
    32. Turn and Slide
    33. Multiple Enhancements
    34. FAQs about NWEA's CCSS. When should our districts consider switching over to CCSS MAP tests? You may change when it is the best fit and time for your district. Can we give some students the CCSS version and other students the state version that we currently give? Yes; you may test any grade level you wish on Common Core and any other grade levels on your state version. When must we inform NWEA of our choice to move to CCSS or to stay with our state version? We would like to know your decision about one to two weeks before your district downloads its next season of data.
    35. FAQs about NWEA's CCSS. How will proficiency on the CCSS be forecasted? Until students are scored under the new CCSS, we will use the 40/70 cuts. How will reports change if we switch to Common Core? Teacher reports, and all other reports that list goal strands, will list the goal strands as defined by the Common Core instead of the strands aligned to your state goal structures.
    36. FAQs about NWEA's CCSS. Do the 2011 norms apply to the Common Core aligned tests? The 2011 norms are carefully constructed to be independent of any specific test. How can we make the change to Common Core MAP tests? The process is simple: just call or email Laura Riley or Sue Madagan at NWEA.
    37. Resources (recorded webinars expire May 13, 2013). A Guide to MAP and the Common Core for Teachers: http://nwea.adobeconnect.com/cc1_resources/. A Guide to MAP and the Common Core for Leaders: http://nwea.adobeconnect.com/cc2_resources/. Handout: Guide to Common Core and Measures of Academic Progress (MAP): What Leaders Need to Know.
    38. Bridging the Gaps: What gaps exist in your district between what you need and what's provided?
    39. Things to consider in SC: no K-2 (new literacy movement in the state); longitudinal growth measures (historical data); growth projections; predictive capabilities via linking studies (ACT and future Common Core); Blended Assessment (growth and proficiency); links to instructional providers (Compass, E2020, Study Island); a transition year. NWEA is distinctive.
    40. Things to consider in SC: interim assessment (optional); adaptability, accurate for both high- and low-performing students (summative tests give only grade-level results); national norms based on 30 million kids so far; valid and reliable status and growth data necessary for teacher evaluation, supporting teacher-effectiveness models; the Growth Research Database (GRD), the most extensive collection of student growth data in the country, with 4.5 billion pairs of test items and responses. NWEA is distinctive.
    41. Understanding your district's unique needs
    42. NWEA Update: Updated SC Linking Study, Common Core, New Items, Field Testing, Teacher Evaluation, Virtual Comparison Groups, Growth Norms, College Readiness
    43. Three primary conditions for using tests for teacher evaluation: (1) selection of an appropriate test, one used for the purpose for which it was designed (proficiency vs. growth) and able to accurately measure the test performance of all students; (2) alignment between the content assessed and the content to be taught; (3) adjusting for context and controlling for factors outside a teacher's direct control (value-added).
    44. Range of teacher value-added estimates. [Chart: Mathematics Growth Index Distribution by Teacher, validity filtered. Each line represents a single teacher: the average growth index score (green line) plus or minus the standard error of the growth index estimate (black line), on a scale from -12 to 12, with quintiles Q1-Q5 marked. Students with tests of questionable validity and teachers with fewer than 20 students were removed.]
    45. Leadership Courage Is Key: ratings can be driven by the assessment. Real or noise? [Chart: observation vs. assessment ratings for three teachers on a 0-5 scale.]
    46. NWEA Update: Updated SC Linking Study, Common Core, New Items, Field Testing, Teacher Evaluation, Virtual Comparison Groups, Growth Norms, College Readiness
    47. Problems we can help you solve: "I know my students didn't make typical growth . . . my school and my students aren't typical." You need an apples-to-apples comparison to demonstrate what is possible: a proof point.
    48. What if we could compare to similar schools and students? We identify all matching schools and students from the GRD, matching on school income, urban vs. rural classification, grade, subject, starting achievement, and assessment dates, and then randomly select the comparison group.
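    The selection procedure on this slide (filter a pool of records to matching students, then randomly sample a comparison group) can be sketched as follows. This is a minimal illustration, not NWEA's actual implementation; the field names, the group size, and the RIT tolerance are all hypothetical.

```python
import random

def build_vcg(target, pool, group_size=51, rit_tolerance=2, seed=0):
    """Match on grade, subject, starting RIT, school FRL band, and locale,
    then randomly select up to group_size matched students."""
    matches = [
        s for s in pool
        if s["grade"] == target["grade"]
        and s["subject"] == target["subject"]
        and abs(s["start_rit"] - target["start_rit"]) <= rit_tolerance
        and s["school_frl_band"] == target["school_frl_band"]
        and s["school_locale"] == target["school_locale"]
    ]
    rng = random.Random(seed)  # seeded for a reproducible sketch
    return rng.sample(matches, min(group_size, len(matches)))

def vcg_mean_growth(group):
    """Mean fall-to-spring RIT growth of the comparison group."""
    return sum(s["spring_rit"] - s["start_rit"] for s in group) / len(group)
```

    A student's actual growth would then be compared against `vcg_mean_growth` of their matched group, which is the apples-to-apples comparison the slide describes.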
    49. Virtual Comparison Groups (VCGs): compare your students' growth to similar students in similar schools; compare to state cut scores; compare to a catch-up growth target.
    50. Virtual Comparison Groups (VCGs): think about the power in these conversations.
    51. Ensure similar testing conditions, since they matter to the reliability of results. [Chart: mean value-added growth by school, comparing students who took 10+ minutes longer in spring than in fall against all other students.]
    52. Virtual Comparison Groups: graphs at the classroom, school, and district level; two hours of remote support; a tool with student and testing-condition data. Optional: data coaching, research consultation.
    53. Problems we can help you solve: "I think our students are learning more now than before, but I don't have the time or expertise to prove it." You need a rigorous and defensible analysis of your longitudinal growth and achievement data.
    54. Learning Pattern Reporting: supports your work in district and school improvement and in communication with the board, administrators, parents, and community. We provide a defensible analysis of your growth and achievement data: grades 2 through 8, contiguous grades, four or more years, fall and spring.
    55. What do we do? We use all your fall and spring MAP data: all students are linked, the data is cleaned, and it is broken into overlapping four-year blocks of data, each lagged by one year. [Diagram: fall and spring terms for grades 2-5 across the 2007-2012 school years, with three overlapping data blocks.]
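    The "overlapping four-year blocks, each lagged by one year" split described on this slide can be sketched directly. This is an illustration of the windowing only, using the school years shown in the slide's diagram; it is not NWEA's actual pipeline.

```python
def lagged_blocks(school_years, block_len=4):
    """Return overlapping windows of school years, each shifted by one year."""
    return [
        school_years[i:i + block_len]
        for i in range(len(school_years) - block_len + 1)
    ]

years = [2007, 2008, 2009, 2010, 2011, 2012]
for i, block in enumerate(lagged_blocks(years), start=1):
    print(f"Data-block {i}: {block}")
# Data-block 1: [2007, 2008, 2009, 2010]
# Data-block 2: [2008, 2009, 2010, 2011]
# Data-block 3: [2009, 2010, 2011, 2012]
```

    Each block is then analyzed on its own, so the same school year contributes to several overlapping analyses.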
    56. What do we do? (cont.) A statistical growth model is then fitted, including both achievement and growth; it represents the "real" school. [Diagram: RIT vs. school year (fall and spring, 2009-2012) for grades 2-5, showing age-cohort and grade-level trajectories.]
    57. Analyze your data and find patterns in growth and achievement.
    58. Findings reported using defensible indicators about your patterns. [Charts spanning the 2010-2012 school years.]
    59. Learning Pattern Reporting: a report on the patterns found in your data; four hours of remote support; a tool you can use. Optional: data coaching, research consultation.
    60. NWEA Update: Updated SC Linking Study, Common Core, New Items, Field Testing, Teacher Evaluation, Virtual Comparison Groups, Growth Norms, College Readiness
    61. South Carolina ESEA Waiver Linking Study, NWEA, updated January 2013
    62. 4 Principles for ESEA Waiver approval: demonstrate CCR (college- and career-ready) expectations for all students; develop high-quality plans to implement a system of differentiated recognition, accountability, and support for all Title I districts and schools.
    63. 4 Principles for ESEA Waiver approval (cont.): commit to developing, adopting, piloting, and implementing teacher and principal evaluation systems that support student achievement; provide assurance that you will evaluate and, if needed, revise administrative requirements to avoid duplication and unnecessary burden on the district.
    64. ESEA Waiver Timeframe: Arne Duncan has currently approved the waivers through the 2013-14 school year; at that time, districts may request extensions.
    65. SC ESEA Waiver Study: Annual Measurable Objectives (AMOs) are now based on school-level mean scale scores rather than the percent of students meeting their proficiency targets (see the graphic on the next slide).
    66. AMO increases: as the AMO increases over the next several years, the mean RIT scores needed will also increase to meet the escalating demand. NWEA correlation tables include data through the 2017-2018 school year.
    67. Increasing AMOs
    68. Example: Table 2 contains the 2012-2013 elementary-level AMO targets and the associated NWEA RIT targets by grade for a school. While this study cannot directly estimate the probability of an elementary school meeting its AMO target, the school could be considered on track for success if all students in grades three, four, and five have NWEA RIT scores greater than 206, 212, and 224, respectively.
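    The "on track" reading of that example can be expressed as a simple check: every student's RIT score must exceed the RIT target for their grade. The targets below are the three values from the slide's 2012-2013 example; everything else (the function shape, the data layout) is an illustrative sketch, not part of the study.

```python
# Grade-level RIT targets from the slide's 2012-2013 elementary example
RIT_TARGETS = {3: 206, 4: 212, 5: 224}

def on_track(students):
    """students: iterable of (grade, rit_score) pairs.
    True only if every student's score exceeds their grade's target."""
    return all(rit > RIT_TARGETS[grade] for grade, rit in students)

print(on_track([(3, 210), (4, 215), (5, 230)]))  # True
print(on_track([(3, 210), (4, 211), (5, 230)]))  # False: grade 4 is below 212
```

    Note the strict inequality: the slide says "greater than" the targets, so a student exactly at the target does not count as on track in this sketch.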
    69. Performance Level Projections, 2011-2014
    70. Performance Level Projections, 2014-2018
    71. AMO vs. PASS Proficiency [charts: PASS linking study and ESEA study]
    72. Proficiency Probability
    73. Correlation and Accuracy by Year
    74. Questions? Laura Riley, Account Manager, NWEA. Laura.Riley@nwea.org, 317-893-2413. Thank you!
    75. NWEA Update: Updated SC Linking Study, Common Core, New Items, Field Testing, Teacher Evaluation, Virtual Comparison Groups, Growth Norms, College Readiness
    76. Partnering to help all kids learn: Paths to Postsecondary
    77. NWEA Research: The NWEA Research Team conducted an alignment study of students who have valid MAP scores and also valid EXPLORE, PLAN, and ACT scores. The results showed a correlation between ACT entrance scores, MAP RIT scores, and the year-to-year growth path needed to achieve the desired entrance score.
    78. Where did the numbers come from? Active NWEA districts that use EXPLORE, PLAN, and ACT; ACT data was matched to corresponding MAP data at the individual level; no formal sampling strategies were employed other than cutting extreme residuals.
    79. ACT Cuts: Reading
    80. ACT Cuts: Language
    81. ACT Cuts: Math
    82. ACT Probability Tables
    83. ACT says: the ACT composite "entrance" scores used are the scores of students who, the ACT data indicates, have a 50% likelihood of achieving a "B" average in a freshman-level course. The demands of the courses differ across postsecondary institutions.
    84. "Types" of postsecondary institutions in these examples: a "state" college; a "top-tier" public university; an elite, "Ivy League" institution.
    85. ACT Entrance Score for a "State" College: entrance ACT Reading/Math composite = 24
    86. ACT Entrance Score for a "Top-Tier" Public University: entrance ACT Reading/Math composite = 29
    87. ACT Entrance Score for an Elite, "Ivy League" Institution: entrance ACT Reading/Math composite = 32
    88. Student "Paths": introducing three normal students and their potential postsecondary paths: Theodore Thirdgrader, Sandra Seventhgrader, and Nate Ninthgrader.
    89. APPROPRIATE Postsecondary Paths: each child is unique; each postsecondary path will be different and MUST be appropriate for each individual student.
    90. Theodore Thirdgrader's Path: NWEA data indicates that for Theodore to achieve the entrance composite ACT score for these institutions, his spring RIT score should approach: ACT 24, RIT 213 (78th percentile); ACT 29, RIT 224 (99th percentile); ACT 32, RIT 229 (99th percentile).
    91. Another Way to Look at Third Grade Spring RIT Scores: the average ACT composite entrance score for an education major is 20.8, and the third-grade spring RIT for a student on a 20.8 trajectory is 209; the average ACT composite entrance score for an engineering major is 23.7, and the third-grade spring RIT for a student on a 23.7 trajectory is 219.
    92. Nate Ninthgrader's Path: NWEA data indicates that for Nate to achieve the entrance composite ACT score for these institutions, his spring RIT score should approach: ACT 24, RIT 237 (53rd percentile); ACT 29, RIT 246 (71st percentile); ACT 32, RIT 251 (79th percentile).
    93. Another Way to Look at Ninth Grade Spring RIT Scores: the average composite entrance score for an engineering major is 23.7, and the ninth-grade spring RIT for a student on a 23.7 trajectory is 254.
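    The student paths on the slides above can be read as a lookup from (grade, target ACT composite) to the spring RIT score a student's trajectory should approach. The sketch below contains only the (grade, ACT, RIT) triples actually shown on the slides; the lookup function itself is an illustrative convenience, not an NWEA tool.

```python
# (grade, target ACT composite) -> spring RIT, taken from the slides
SPRING_RIT_FOR_ACT = {
    (3, 24): 213, (3, 29): 224, (3, 32): 229,   # Theodore Thirdgrader
    (9, 24): 237, (9, 29): 246, (9, 32): 251,   # Nate Ninthgrader
}

def rit_goal(grade, act_composite):
    """Spring RIT a student should approach for a target ACT entrance score.
    Returns None for pairs not shown on the slides."""
    return SPRING_RIT_FOR_ACT.get((grade, act_composite))

print(rit_goal(3, 24))  # 213: a third grader aiming at a "state" college
print(rit_goal(9, 32))  # 251: a ninth grader aiming at an "Ivy League" school
```

    Note how the same ACT target maps to very different RIT goals by grade (213 in grade three vs. 237 in grade nine for ACT 24), which is the growth-path idea the research slide describes.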
    94. Questions? Laura Riley, Account Manager, NWEA. Laura.Riley@nwea.org, 317-893-2413. Thank you!
    95. NWEA Update: Updated SC Linking Study, Common Core, New Items, Field Testing, Teacher Evaluation, Virtual Comparison Groups, Growth Norms, College Readiness
    96. Standalone Field Testing: improve the ability to assess current and new standards; introduce your students to innovative new technology-enhanced assessment items; increase the breadth and depth of the MAP item pool; partner with NWEA to maintain the integrity of the RIT scale. Earn credits toward products and services for participation! © 2013 by Northwest Evaluation Association
    97. Current Needs: we are currently seeking students to participate in field tests and research studies in: Spanish math, grades 2-5 (students must be able to read Spanish); math and reading constructed response, grades 4-10; writing prompt, grades 4-10; MAP reading and math on iPads, grades 4, 6, and 8; technology-enhanced math items, grades 2-10 (coming back in Fall 2013: reading, language usage, and general science). Sign up now; availability is limited!
    98. Participation Requirements: you do not have to be a current MAP partner; there is no limit to the number of students who can participate; students can take multiple tests in different subject areas; each test takes about one hour; PCs or Macs must meet the minimum technical requirements (the same as MAP); constructed-response testing requires a short pre-testing webinar for proctors; mobile-assessment participants must provide their own iPads. More information at www.nwea.org/sft.
    99. For More Information: visit www.nwea.org/sft; contact your Account Executive or Account Manager; or email questions and requests to the Field Test Recruiting Team at fieldtest@nwea.org. We appreciate your partnership!
    100. Thank you for your continued partnership with NWEA. Sincerely, Your NWEA Team: Laura Riley, Sue Madagan, Alison Levitt, and Andy Hegedus
