Speaker note (slide 5): It will take until 2012 to do what we already know how to do with new content standards; getting multiple states to work together is an added challenge. The combination of these two demands will preclude any significant innovation in assessments that are supposed to be operational by 2012 for high-stakes use, e.g., school accountability.
    1. Dramatically Better Assessment Systems: Advice for RTTT “Common Assessment” RFP. Brian Gong, Center for Assessment. Presentation for the Input Meetings Sponsored by the U.S. Department of Education for the “Common Assessment” RFP, “Race to the Top” funding. November 17, 2009, Atlanta, GA
    2. My Main Point <ul><li>The future of assessment in the United States will be shaped by what gets funded in this “Common Assessment” RFP. </li></ul><ul><li>USED should shape the RFP and fund it with a longer-term view of having in place dramatically better assessment systems in ten years. </li></ul><ul><ul><li>When USED has to compromise, choose longer-term investments over short-term gains </li></ul></ul><ul><ul><li>Say very clearly what you want in the RFP </li></ul></ul><ul><ul><li>Help foster good responses to the RFP </li></ul></ul>Gong – USED Common Assessment RFP Input Mtg – 11/17/09
    3. Personal recommendations <ul><li>Hedge bets by funding multiple ways to do multi-state common assessment, especially high school </li></ul><ul><li>Invest in six “game changers” that could make assessment dramatically better within a decade, but should not be framed as being operationally implemented on the short time schedule (“2012”) </li></ul><ul><li>Help foster good responses to the RFP and after </li></ul>
    4. Short-term and Longer-term Investments <ul><li>The Common Assessment RFP should fund </li></ul><ul><li>For implementation by 2012, what we already know how to do in large-scale assessment, but </li></ul><ul><ul><li>With a new set of content standards </li></ul></ul><ul><ul><li>With groups of multiple states (difficult to do) </li></ul></ul><ul><li>For development through 2015, what we do not know how to do well at scale, but which has the potential to lead to dramatically better assessment systems </li></ul>
    5. Implementing a new multi-state summative assessment takes years <ul><li>Context: 50 state systems, NAEP, TIMSS, PISA, PIRLS, many LEA systems, NRTs, ACT/SAT, colleges’ tests, etc. </li></ul><ul><li>Award RFP(s) (9/2009) </li></ul><ul><li>Test specifications; develop items; use specs, reports, equating design, administration agreements, etc. (2009–10) </li></ul><ul><li>Pilot test items, promulgate high-stakes policies, etc. (2010–11) </li></ul><ul><li>First operational administration and reporting, etc. (2011–12) </li></ul><ul><li>Second operational administration; first report using growth, etc. (2012–13) </li></ul><ul><li>Fourth operational administration; first graduating high school class, etc. (2014–15) </li></ul><ul><li>Fast implementation of RFP: 2012 (e.g., multi-state assessments with common content standards, “Peer Review” quality of things we know how to do) </li></ul><ul><li>And aligning curriculum, instruction, accountability, and supports takes longer. </li></ul>
    6. RFP: Specify, Specify, Specify <ul><li>USED should specify its purpose, theory of action, and how the assessment results will be used so responders know the big picture </li></ul><ul><li>Specify what is wanted as a deliverable, and set the parameters for responders’ creative proposals (e.g., time schedule) </li></ul><ul><li>Specify the means by which an outcome should be achieved if USED really wants a specific means </li></ul>
    7. Some Model Systems for 2012 <ul><li>Cross-state comparisons </li></ul><ul><li>Standards-based interpretation </li></ul><ul><li>Inform better instruction </li></ul><ul><li>Rapid turn-around </li></ul><ul><li>Measure growth </li></ul><ul><li>Measure student performance for teacher/administrator evaluation </li></ul>
    8. Cross-state Comparisons (2012) <ul><li>Purpose, TOA, Use: Hold students, schools, LEAs, and states accountable to a common performance standard by triggering sanctions </li></ul><ul><li>Outcome: Statistically robust reports of performance on a common metric with no “wiggle room” – stronger than current NAEP mapping studies </li></ul><ul><li>Means: Same content standards, same test specifications, same performance standards, single assessment across states, same administration procedures, strong equating across years </li></ul>
    9. Standards-based Interpretation (2012) <ul><li>Purpose, TOA, Use: Promote equity through holding students and schools to common opportunity-to-learn (content standards) and minimal performance standards </li></ul><ul><li>Outcome: Valid reports of performance related to the designated standards </li></ul><ul><li>Means: Aligned, grade-level only (?), matrix-sampled (?), high school (?), SWD (?), ELL (?) </li></ul>
    10. Inform Better Instruction (2012) <ul><li>Purpose, TOA, Use: Assess more complex and applied learning (monitor); model and encourage instruction (drive) </li></ul><ul><li>Outcome: Incrementally better, more valid and reliable measurement of higher-order, complex student performances (?); more widespread “good” instruction (?) </li></ul><ul><li>Means: Curriculum-embedded assessments (e.g., standardized units, portfolios, graduation projects) (?); curricula with (local) matched assessments (?) </li></ul>
    11. Rapid Turn-around (2012) <ul><li>Purpose, TOA, Use: Promote improvement through rapid feedback to inform actions </li></ul><ul><li>Outcome: Reports of performance useful to decisions and actions, in an appropriate timeframe (distinguish actions that are multi-year or annual monitoring from annual rich content analysis from shorter-term uses, down to course grades and student instructional feedback) </li></ul><ul><li>Means: Trade off speed against quality and cost: greater reliance on multiple-choice/machine-scored items; trade off centralized standardization against complex performances and local scoring; ignore administration variations (e.g., missing students) </li></ul>
    12. Measure Growth (2012) <ul><li>Purpose, TOA, Use: Accountability, program improvement, teacher accountability? </li></ul><ul><li>Outcome: Report of student progress over time related to what is/could be/should be: grade-level standards (?), own starting point (?), other students (?), program supports (?), “teacher’s contribution” (?); how to use in accountability (?) </li></ul><ul><li>Means: Out-of-level testing (?), adaptive testing (?), vertical [moderated] scales (?), use math to predict reading for greater reliability (?), pre-/post-measures within year (?) </li></ul>
    13. Teacher/administrator evaluation (2012) <ul><li>Purpose, TOA, Use: Improve teacher quality by providing feedback (?); use in accountability or other high-stakes decisions (?) </li></ul><ul><li>Outcome: Changes in student performance associated with (attributable to?) specific teachers, administrators, programs </li></ul><ul><li>Means: Many statistical approaches (check assumptions, limitations) (?); combine with other information (?) </li></ul>
    14. Personal recommendations <ul><li>Hedge bets on multiple ways to do multi-state common assessment, especially high school </li></ul><ul><li>Invest in six “game changers” that could make assessment dramatically better within a decade, but should not be framed as being operationally implemented on the short time schedule (“2012”) </li></ul><ul><li>Help foster good responses to the RFP and after </li></ul>
    15. Invest in “Game Changers” - 1 <ul><li>Develop technology that provides evidence of more complex knowledge and skills (i.e., more valid) </li></ul><ul><ul><ul><li>E.g., interactive simulations, non-academic knowledge and skills </li></ul></ul></ul><ul><ul><li>Only use technology with an evidence-centered design approach, to maintain construct relevance and include most students </li></ul></ul>
    16. Invest in “Game Changers” - 2 <ul><li>Develop technology for validity </li></ul><ul><li>Develop complex performance assessment </li></ul><ul><ul><li>Specify extended learning and content, real application contexts, student choice </li></ul></ul><ul><ul><li>Develop credible (local) administration and scoring </li></ul></ul><ul><ul><li>Include all students (and teachers) </li></ul></ul><ul><ul><li>Develop means of certifying validity and reliability, and of combining with other evidence </li></ul></ul>
    17. Invest in “Game Changers” - 3 <ul><li>Develop technology for validity </li></ul><ul><li>Develop complex performance assessment </li></ul><ul><li>Develop curricula that specify the “what” and “how” of learning, and associated local assessment systems </li></ul><ul><ul><li>Interim and formative assessments are needed to inform learning directly </li></ul></ul><ul><ul><li>The real assessment problem is informing “What should be done next?” – it cannot be solved without curriculum and teacher/administrator expertise </li></ul></ul>
    18. Invest in “Game Changers” - 4 <ul><li>Develop technology for validity </li></ul><ul><li>Develop complex performance assessment </li></ul><ul><li>Develop curricula, local assessment systems </li></ul><ul><li>Develop new measurement models and technical criteria for assessments of complex knowledge and skills </li></ul><ul><ul><li>We know current models’ assumptions and limitations; do not impose them on innovations! (Example: reliability vs. validity of complex performances; cognitive vs. unidimensional models) </li></ul></ul>
    19. Invest in “Game Changers” - 5 <ul><li>Develop technology for validity </li></ul><ul><li>Develop complex performance assessment </li></ul><ul><li>Develop curricula, comprehensive assessment systems </li></ul><ul><li>Develop new measurement models and technical criteria </li></ul><ul><li>Develop better accountability models and support better use of assessment results for program improvement </li></ul><ul><ul><li>Assessments, assessment use, and instruction are being distorted by our current accountability model </li></ul></ul>
    20. Invest in “Game Changers” - 6 <ul><li>Develop technology for validity </li></ul><ul><li>Develop complex performance assessment </li></ul><ul><li>Develop curricula, local assessment systems </li></ul><ul><li>Develop new measurement models and technical criteria </li></ul><ul><li>Develop better models of accountability and program improvement </li></ul><ul><li>Develop model specifications for a coherent comprehensive assessment system that incorporates the above five </li></ul><ul><ul><li>e.g., NAEP, state, </li></ul></ul>
    21. Invest in “Game Changers” - 7 <ul><li>Technology for validity </li></ul><ul><li>Complex performance assessment </li></ul><ul><li>Curricula & comprehensive assessment systems </li></ul><ul><li>New measurement models and technical criteria </li></ul><ul><li>Better accountability models and support for program improvement </li></ul>
    22. Personal Recommendation - 2 <ul><li>Invest in five assessment “game changers” </li></ul><ul><li>Hedge bets on multiple ways to do multi-state “2012” common assessment, especially high school </li></ul><ul><ul><li>Good current models: all MC, mixed MC-CR, computer-based, end-of-course, survey, etc. </li></ul></ul><ul><ul><li>Interwoven with state policies (e.g., high school exit requirements) </li></ul></ul><ul><li>Help foster good responses to the RFP and after </li></ul>
    23. Hedge bets on 2012 assessment <ul><li>End-of-course AND Grade 11 survey </li></ul><ul><li>Computer-based AND paper & pencil </li></ul><ul><li>All multiple choice AND modest short CR AND larger amount and more extensive CR </li></ul><ul><li>Fund multiple “common content standards” </li></ul><ul><ul><li>To find out the costs and benefits of multi-state common assessments </li></ul></ul><ul><ul><li>Because no one set of content standards is clearly superior </li></ul></ul><ul><ul><li>Because no one approach is clearly superior </li></ul></ul><ul><ul><li>Because reporting on a common score metric is less important </li></ul></ul>
    24. RFP Portfolio of Awards <ul><li>Multiple (around 8) strong models that represent advances that can be implemented strongly by 2012 and that help get to the longer-term goal </li></ul><ul><ul><li>Consider strategy: Do not fund strong models that will be adopted even if not funded </li></ul></ul><ul><li>Multiple (perhaps 12) strong “game changer” awards </li></ul>
    25. Personal Recommendation - 3 <ul><li>Invest in four assessment “game changers” </li></ul><ul><li>Hedge bets on multiple ways to do multi-state common assessment, especially high school </li></ul><ul><li>Help foster good responses to the RFP and after </li></ul><ul><ul><li>If USED wants certain outcomes of states working together, then promote leadership to make that happen among states, NGOs, test vendors, etc. </li></ul></ul>
    26. Fostering Strong RFP Responses <ul><li>Provide clear RFP specs and different awards for “2012 implementation” and “game changers” </li></ul><ul><li>If USED wants states to have vendor partners in their RFP responses, it needs to indicate that early and facilitate it well (vs. states’ issuing an RFP) </li></ul><ul><li>USED should think about what states that don’t get RTTT common assessment funds will do </li></ul><ul><li>USED should think about how what it funds will be adopted after RTTT and how that will shape what is available in the future </li></ul>
    27. Envision Intended & Unintended Consequences <ul><li>What if in 2012 there were five widely used assessments, all aligned to the same common content standards? </li></ul><ul><ul><li>Four were commercially available from current test publishers (like the Achieve/Pearson Algebra 2 end-of-course exam) </li></ul></ul><ul><ul><li>One was available by joining a consortium (like the WIDA ELP exams) </li></ul></ul><ul><ul><li>States were purchasing elementary math from one vendor and high school English from another vendor </li></ul></ul><ul><li>What if there were only one assessment being used? What if there were 46? </li></ul>
    28. Envision Intended & Unintended Consequences – 2 <ul><li>What if in 2012 each commercially available assessment came in five versions: </li></ul><ul><ul><li>An all multiple-choice, computer-administered short form that took 20 minutes and cost $3 per student </li></ul></ul><ul><ul><li>An all multiple-choice, computer or paper & pencil form that took 50 minutes and cost $7 per student </li></ul></ul><ul><ul><li>A computer or p & p version that took 120 minutes, had 40 multiple choice, 8 short constructed response, and 4 extended constructed response items, and cost $15 per student </li></ul></ul><ul><ul><li>A computer or p & p version that took 150 minutes, had 40 multiple choice, 4 extended constructed response, and 2 long constructed response items, and cost $60 per student </li></ul></ul><ul><ul><li>A version that included a standardized test like option 3 and had a curriculum-embedded project and other performance evidence that was centrally audited, and cost $200 per student </li></ul></ul>
    29. For more information: <ul><li>Center for Assessment </li></ul><ul><li>www.nciea.org </li></ul><ul><li>Brian Gong </li></ul><ul><li>[email_address] </li></ul>
