
Student testing that works for no one


Published in: Education, Technology


  1. Dr. Mary Byrne, Ed.D. Missouri Coalition Against Common Core, October 5, 2013
  2. Local & State / National & International
      Universal screening
      Tests and assessments
       ◦ Formative assessment: daily, weekly, or interval informal classroom-based; quarterly, formal, computer-based (Smarter Balanced Assessment Consortium, SBAC)
       ◦ Summative assessment: classroom instructional-unit level (teacher- or publisher-made); district tests (may be formal, publisher-made standardized tests); state year-end testing for designated grades (MAP, SBAC); end-of-course exams
      National tests
       ◦ National Assessment of Educational Progress (NAEP)
      International tests
       ◦ Programme for International Student Assessment (PISA)
  3. Name     Homework Average   Chapter 1 Test
     John     90                 70
     Bill     50                 78
     Susan    110                62
     Felicia  10                 85
     Amanda   95                 90

     Name     Objective 1: Write an alternate ending for a story   Objective 3: Compare and contrast two stories
     John     Partially proficient   Partially proficient
     Bill     Proficient             Partially proficient
     Susan    Partially proficient   Partially proficient
     Felicia  Advanced               Proficient
     Amanda   Partially proficient   Proficient

     Source (truncated URL): tional_leadership/oct08/vol66/num02/Seven_Reasons_for_Standards-Based_Grading.aspx
     Note that two objectives (1 and 3) may require more class instruction. The notations for Objective 2, on the other hand, suggest that the class only needs practice and one student needs some re-teaching. Consistent with Linda Darling-Hammond's statements about attitudes and values as components of the new school curriculum, other grades separated out for evaluation in school SBG are Citizenship & Collaboration.
  4.  Students who struggle can continue to retest and use alternate assessments until they show proficiency, and they are not penalized for needing extended time. I guide students with special needs to modify their work and, if needed, develop different ways of demonstrating that they've met their proficiency goals. Their working styles can be easily accommodated in this system because modified assignments and assessments require no special adjustments in the grade book. The grade book simply shows where they are in meeting the standards, without reference to how they are demonstrating their learning or what modifications needed to be made.
      Scriffiny, P. L. (October 2008). Seven Reasons for Standards-Based Grading. Expecting Excellence, Volume 66, Number 2, pp. 70-74.
  5.  For the first time in five years, Missouri's 2013 MAP math scores dropped.
      ◦ ssouri-test-scores-in-math-drop/article_824d2bc7-bcdd-5edc-a80a-9fda4016b105.html
      ◦ -scores-show-drop-in-math-proficiency-for-columbia-students/
       Note: Columbia students include a high proportion of children of university and medical professionals.
  6.  National Assessment of Educational Progress:
      ◦ 2011 data from the National Assessment of Educational Progress (NAEP). Note that since Governor Nixon's adoption of the Common Core State Standards, Missouri's reading scores had dropped to the 2002 level. (/2012454MO4.pdf)
  7.  Programme for International Student Assessment
       The PISA exam is one of a handful of tests that compare educational levels across nations, and it is considered the most comprehensive.
       Scores from the 2009 [PISA] . . . show 15-year-old students in the U.S. performing about average in reading and science, and below average in math. Out of 34 countries, the U.S. ranked 14th in reading, 17th in science, and 25th in math.
       "This is an absolute wake-up call for America," U.S. Education Secretary Arne Duncan said . . . "The results are extraordinarily challenging to us and we have to deal with the brutal truth. We have to get much more serious about investing in education."
       international-ranking_N.htm
  8. PISA Mathematics
     The PISA mathematics framework and explanations do not cover the appropriate grade-level material, and the released items indicate that the exam is quite weak in mathematical content. Further, PISA developers give only a vague description of what they mean by "mathematical literacy." This is a problem-solving test and, although mathematics is used, that seems almost incidental. Many problems have no apparent mathematical content. Because of this low level of required content knowledge, the claim that PISA tests "preparedness for further study" is rather dubious. Further, the test itself is unbalanced, overemphasizing data display. Most of the content that is expected of a 15-year-old on PISA is what younger students should have already mastered. As a serious problem-solving test using elementary mathematics, the PISA assessment might function nicely. However, results from PISA ought not to be used to interpret how successful a school system is at imparting grade-level mathematical knowledge and understanding, nor are the PISA framework and released items a suitable model for U.S. standards setters at any level. (p. 2) [emphasis added]
     Carmichael, S. B., Wilson, W. S., Finn, C. E., Winkler, A. M., & Palmieri, S. (2009). Stars by Which to Navigate, An Interim Report on Common Core, NAEP, TIMSS, and PISA. The Thomas B. Fordham Institute. Retrieved from 20Navigate%20-%20October%202009.pdf
  9. PISA Reading
     The PISA Reading Framework suffers from a number of problems. It does not address vocabulary development; offer specific expectations for reading and analyzing literary and non-literary texts; include specific expectations for the correct use of English language conventions; recognize the importance of literary heritage; or sufficiently define the amount, quality, and complexity of literary and informational texts that students should read. Much of the content focuses on metacognitive strategies for accessing information and addresses neither reading comprehension nor other critically important strands of English language arts that should be delineated in a set of strong national or state English standards for the U.S. (e.g., literature, writing, research, grammar and conventions, communication skills, etc.). Moreover, the framework is dense and murky, including multiple (and unnecessary) levels of detail about task types. Without any graphical explanation, it's difficult to follow and not helpful for informing curriculum and instruction. (p. 2)
     Carmichael, S. B., Wilson, W. S., Finn, C. E., Winkler, A. M., & Palmieri, S. (2009). Stars by Which to Navigate, An Interim Report on Common Core, NAEP, TIMSS, and PISA. The Thomas B. Fordham Institute. Retrieved from 0by%20Which%20to%20Navigate%20-%20October%202009.pdf
  10. On April 14, 2010, Commissioner Chris Nicastro signed the Smarter Balanced Assessment Consortium Document of Commitment, committing Missouri to use tests that had not yet been created and therefore had no demonstrated psychometric quality, and committing the legislature and school districts of Missouri to an unknown product at unknown cost.
      Smarter Balanced has released cost estimates for its assessments that include expenses for ongoing research and development of the assessment system, as well as test administration and scoring. The end-of-year summative assessment alone is estimated to cost $22.50 per student. The full suite of summative, interim, and formative assessments is estimated to cost $27.30 per student. These costs are less than the amount that two-thirds of the Consortium's member states currently pay. These costs are estimates because a sizable portion of the cost is for test administration and scoring services that will not be provided by Smarter Balanced; states will either provide these services directly or procure them from vendors in the private sector.
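The per-student figures quoted above imply simple district-level totals. A minimal sketch of that arithmetic follows; only the $22.50 and $27.30 per-student estimates come from the Smarter Balanced figures, and the 10,000-student enrollment is a hypothetical number chosen purely for illustration.

```python
# District-level cost sketch using the Smarter Balanced per-student
# estimates quoted above. The enrollment figure is a hypothetical
# assumption, not any real district's count.
SUMMATIVE_PER_STUDENT = 22.50   # end-of-year summative assessment alone
FULL_SUITE_PER_STUDENT = 27.30  # summative + interim + formative assessments

enrollment = 10_000  # hypothetical district enrollment

summative_total = enrollment * SUMMATIVE_PER_STUDENT
full_suite_total = enrollment * FULL_SUITE_PER_STUDENT

print(f"Summative only: ${summative_total:,.2f}")   # $225,000.00
print(f"Full suite:     ${full_suite_total:,.2f}")  # $273,000.00
print(f"Added cost of interim/formative pieces: "
      f"${full_suite_total - summative_total:,.2f}")
```

At these estimates, the interim and formative components add $4.80 per student on top of the summative test, so the gap scales linearly with enrollment.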
  11. Race to the Top Assessment: Smarter Balanced Assessment Consortium Year Two Report, May 2013 (p. 5)
  12. High-Fidelity Assessment of Critical Abilities
      The Common Core State Standards identify a number of areas of knowledge and skills that are clearly so critical for college and career readiness that they should be targeted for inclusion in new assessment systems. (p. 7)
      [We need] . . . the building of character, so that students can . . . develop the attributes, mindsets, character and values for future success. (p. 1)
      Darling-Hammond, L., Herman, J., Pellegrino, J., et al. (2013). Criteria for high-quality assessment. Stanford, CA: Stanford Center for Opportunity Policy in Education. Retrieved from assessment_2.pdf
  13.  One of the sample performance tasks in the planned SBAC assessment requires that students take up a social-scientific topic about which there are different views, engage in a Google search to find and weigh scientific and historical evidence, evaluate the credibility of the evidence, and develop a cogent essay that takes a position on that topic. Students are also expected to revise their work before it is final. (p. 8)
       Darling-Hammond, L., Herman, J., Pellegrino, J., et al. (2013). Criteria for high-quality assessment. Stanford, CA: Stanford Center for Opportunity Policy in Education. Retrieved from higher-quality-assessment_2.pdf
  14. . . . develop a cogent essay that takes a position on that topic
      The Protection of Pupil Rights Amendment [PPRA] affords parents certain rights regarding any program administered by the Secretary of Education . . .
      §98.4 Protection of student's privacy in examination, testing, or treatment.
       (a) No student shall be required, as part of any program specified in § 98.1 (a) or (b), to submit without prior consent to . . . testing . . . in which the primary purpose is to reveal information concerning . . . (1) political affiliations; . . .
  15. Smarter Balanced identified four achievement levels [note the similarity to SBG], with level 3 for 11th-graders indicating that the student demonstrates "adequate understanding of and ability to apply the knowledge and skills associated with college content-readiness" and that those students performing at level 3 are "exempt from developmental course work, contingent on evidence of continued learning in Grade 12." The Achievement Level Descriptors (ALDs) will be refined, as necessary, following the field test in spring 2014 and the first operational test in spring 2015. (p. 8)
  16. States will be required to develop high-quality items that fully meet all Smarter Balanced guidelines and specifications. Smarter Balanced will be the sole owner of the items developed using this approach. Smarter Balanced will continue to make policy decisions related to accessibility and accommodations for students with disabilities and English learners. The consortium expects to release its guidelines for accessibility and accommodations as well as its common definition of English learner; they will be included in the field test administration manual in fall 2013.
  17.  The United States has emphasized reliability (the scoring is technically correct and consistent) and downplayed tests that accurately measure the more complex standards (validity). Other countries have obtained reliability through professionally scored exam systems, combined with collections of student work that are scored by teachers and checked by professional scorers.
       Tucker, M. (March 2010). An Assessment System for the United States: Why Not Build on the Best? Paper presented at the National Conference on Next Generation K-12 Assessment Systems.
  18. No one has seen the tests. No one.
      CATO LIVE, Oct. 3, 2013: Common Core: The Great Debate
  19. What are Accommodations without Validity and Reliability?
      Smarter Balanced Assessment Consortium: Usability, Accessibility, and Accommodations Guidelines. Prepared with the assistance of the National Center on Educational Outcomes, September 11, 2013. ,%20Accessibility,%20and%20Accommodations%20Guidelines.pdf
  20.  In summer 2012, Smarter Balanced contracted with Dr. Stephen Sireci to develop the research plan with input from the TAC [Technical Advisory Committee] members. This plan will establish the Smarter Balanced argument for validity and reliability in making inferences about students' college and career readiness. The research plan will be finalized early in 2013. thetop-assessment/reports/sbac-year-2.pdf
  21.  Smarter Balanced Validity Argument: A validity argument is the combination of declarations and related empirical evidence needed to support a particular use or interpretation of a test score. Adherence to established blueprints is critical to the Smarter Balanced validity argument. The Smarter Balanced blueprints articulate how the assessment is aligned to the CCSS and provide the basis for the reliability and validity of our reporting claims. Because defensible standard setting is key to establishing validity, the blueprints are essential to this activity. content/uploads/2011/12/Smarter-Balanced-Preliminary-Test-Blueprints.pdf
  22. CHALLENGES
      The Department notes, however, that the consortium faced challenges in some areas.
      Item Development
      While Smarter Balanced item development is well underway, the consortium also experienced several challenges. During the item review process in fall 2012, the consortium recognized that the review process for ensuring the quality of the items was not sufficient. As a result, the consortium reduced the number of items developed for the pilot test (from 10,000 to 5,000) so that an additional review could occur to provide a clearer focus on quality and alignment to the CCSS. Moving forward, the consortium will develop 38,000 items in year three for the field test in spring 2014. It is essential that Smarter Balanced maintain a strong process for determining the quality of the items being developed. This will require that the consortium monitor and evaluate the processes for writing and reviewing items as well as for reviewing the quality of the items themselves, and that the consortium include external content experts in English language arts and mathematics as a component of the item development process. The RFP for field-test item writing, released in December 2012, included several of these components to strengthen the consortium's quality control measures. (p. 23) assessment/reports/sbac-year-2.pdf
  23. (Ben Stansall/AFP/Getty Images) By Valerie Strauss, Published: September 27 at 12:54 pm gates-it-would-be-great-if-our-education-stuff-worked-but/
      "It would be great if our education stuff worked, but that we won't know for probably a decade."
      That's what Bill Gates said on Sept. 21 . . . about the billions of dollars his foundation has plowed into education reform, during a nearly hour-long interview he gave at Harvard University. He repeated the "we don't know if it will work" refrain about his reform efforts a few days later during a panel discussion at the Clinton Global Initiative.
  24. "There is no experimental evidence to back up this dialectical/constructivist view of self being created by the required assessments being pushed under the Common Core, or by the OECD [Organisation for Economic Co-operation and Development] to be considered internationally competitive in the future. In fact, we have to look instead to existential philosophy, meditation, spiritual, and history-of-religion literatures to locate proof that the kind of personality we want to use education to create is actually possible."
       decision-problems-imposing-psychological-experiments-on-students/
  25.  Would you say, "That sounds like a wonderful mandate for all schools and all students. Here's my tax dollars to fund the transformation"?
       Well, of course, we wouldn't. That's the beauty of the misrepresentations surrounding the Common Core and charters: duplicitous language actually mandating Maslow's psychological model of growth, and the lack of genuine appreciation for what the OECD's PISA "test" is measuring, put the end goal of a revolutionary new purpose for education on automatic pilot toward fruition, even though no one would agree to it voluntarily with their own money. This despite the fact that warning after warning is out there in the small print that this is all a massive psychological experiment designed to gain a nonconsensual political and social transformation, starting at the level of the student's personality.
  26. Missouri Comprehensive Data System
  27.  As the dust settles, it looks like Smarter Balanced and PARCC, the two consortia that won grants to develop tests aligned to the Common Core, handed over federal funds to Pearson, which is also developing assessments and services aligned to the Common Core. This is what corporate welfare looks like. . . . I would love to be surprised, but I also suspect that the codebase from the "open source tool" will never see the light of day.
  28.  grading/p/2388560990/youtube-marzano
       tent/article/143-standards-based-grading/192-researchlinks-for-standards-based-reporting