2. Definition Descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn about and investigate a topic over a broad span of time (e.g., 6 to 8 years) (Duschl et al., 2007). “… they lay out in words and examples what it means to move toward more expert understanding” (Wilson & Bertenthal, 2005)
3. Definition “… are empirically grounded and testable hypotheses about how students' understanding of, and ability to use, core scientific concepts and explanations and related scientific practices grow and become more sophisticated over time, with appropriate instruction” (Duschl et al., 2007).
4. Five Essential Characteristics
1. Clear end points defined by social aspirations and the central concepts and themes in the discipline
2. Progress variables that identify the critical dimensions of understanding and skill
3. Stages of progress that define significant intermediate steps in conceptual/skill development
4. Learning performances, which are the operational definitions of what children's understanding and skills look like at each of these stages of progress
5. Assessments that measure student understanding of the key concepts or practices and track their developmental progress over time
8. Origins of the Idea / Key Milestones (1996–2007)
- 1996 – Hughes: Progression in Learning
- 1997 – Masters & Forster: Developmental Assessment (ACER)
- Roberts, Wilson & Draney: SEPUP embedded assessment and progress maps
- 2002 – NCLB
- 2004 – Commissioned paper – Smith, Wiser, Anderson & Krajcik: Atomic-Molecular Theory
- 2005 – Commissioned paper – Catley, Lehrer & Reiser: Evolution
- 2005 – Wilson: Construct Maps
- 2005 – NRC report: Systems for State Science Assessment
- 2007 – NRC report: Taking Science to School
- CPRE launched CCII – move toward adaptive instruction
9. Key People Mark Wilson, University of California, Berkeley; James Pellegrino, University of Illinois at Chicago; Leona Schauble & Richard Lehrer, Peabody Campus, Vanderbilt University; Carol L. Smith, University of Massachusetts at Boston; Tom Corcoran, Teachers College, Columbia University; Alicia Alonzo, University of Iowa; Joseph Krajcik, University of Michigan; Richard J. Shavelson, Stanford University
10. Influential Articles
Wilson, M. (2009). Measuring progressions: Assessment structures underlying a learning progression. Journal of Research in Science Teaching, 46(6), 716–730.
Mohan, L., Chen, J., & Anderson, C. (2009). Developing a multi-year learning progression for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching, 46(6), 675–698.
Catley, K., Lehrer, R., & Reiser, B. (2005). Tracing a prospective learning progression for developing understanding of evolution. Commissioned paper prepared for the National Research Council’s Committee on Test Design for K–12 Science Achievement, Washington, DC.
Smith, C., Wiser, M., Anderson, C., & Krajcik, J. (2004). Implications of research on children’s learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory. Commissioned paper prepared for the National Research Council’s Committee on Test Design for K–12 Science Achievement, Washington, DC.
Wilson, M., & Draney, K. (2004). Some links between large-scale and classroom assessments: The case of the BEAR Assessment System. In Towards coherence between classroom assessment and accountability (NSSE Yearbook, Vol. 103, Part II).
11. Influential Publications
Corcoran, T., Mosher, F. A., & Rogat, A. (2009). Learning progressions in science: An evidence-based approach to reform. New York: Center on Continuous Instructional Improvement, Teachers College, Columbia University.
National Research Council. (2005). Systems for state science assessment. Committee on Test Design for K–12 Science Achievement. M. R. Wilson & M. W. Bertenthal (Eds.). Washington, DC: National Academy Press.
National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment. J. Pellegrino, N. Chudowsky, & R. Glaser (Eds.). Washington, DC: National Academy Press.
14. Learning Progressions & the Assessment Triangle The triangle's three vertices (Cognition, Observation, Interpretation) align with the building blocks of assessment design: Construct Maps (Cognition), Item Design (Observation), and Outcome Space / Measurement Model (Interpretation)
15. Cognition: Construct Maps (Specification of the Construct)
- The working definition of what is to be measured
- Learning performances: suggest connections between the conceptual knowledge in the standards and related abilities and understandings that can be observed and assessed
- A “developmental perspective” regarding student learning: assessing the development of student understanding of particular concepts and skills over time
17. Interpretation: Outcome Space & Measurement Model
- Outcome space: qualitatively different levels of responses to items and tasks, associated with different levels of performance; scoring guides for student responses to assessment tasks
- Measurement model: the basis on which assessors and users associate scores earned on items and tasks with particular levels of performance; the measurement model must relate the scored responses to the construct (example: Item Response Theory)
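To illustrate how a measurement model links scored responses back to positions on a construct, here is a minimal sketch of the Rasch (one-parameter) IRT model in Python. The function name and the ability/difficulty values are illustrative, not drawn from the slides; they show only the general shape of the model: the further a student's ability sits above an item's difficulty on the construct scale, the higher the modeled probability of a correct response.

```python
import math

def rasch_probability(theta, b):
    """Probability that a student with ability theta answers an item
    of difficulty b correctly, under the Rasch (1PL) IRT model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student whose ability exactly matches the item's difficulty
# has a 50% modeled chance of success:
print(rasch_probability(0.0, 0.0))  # 0.5

# Items placed higher on the construct map (larger b) are harder
# for the same student:
print(rasch_probability(0.0, 1.5) < rasch_probability(0.0, 0.5))  # True
```

In practice, fitting such a model to scored responses places students and items on the same scale, which is what lets assessors read a score as a level on the progression rather than a raw count.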
Editor's Notes
TRACY
TRACY What the definitions share:
- Vertical development over an extended period of time
- Learning is envisioned as a development of progressive sophistication in understanding and skills within a domain
- No references to grade- or age-level expectations
- Learning is conceived as a sequence or continuum of increasing expertise
COREY
COREY
COREY
Martin Hughes – Progression in Learning – “about short-term and long-term changes in children’s knowledge and understanding which take place as they progress through various domains of learning.” Five studies – Ch. 1: Penny Munn – progression in preschool conceptions of literacy and numeracy; Ch. 2: progression between ages 6–13 in understanding of two distinct areas of math and science; Ch. 3: progression between ages 7–14 in ideas about historical enquiry and explanation; Ch. 4: ages 9–14, abilities to carry out scientific investigations; Ch. 5: understanding of the nature of science between ages 9–16. Masters & Forster – researchers at ACER (Australian Council for Educational Research) – in 1997 created Developmental Assessment, the ACER Assessment Resource Kit (ARK); also did extensive work evaluating and benchmarking Australian students’ literacy and numeracy in 1996–98. Roberts, Wilson, Draney – created SEPUP (Science Education for Public Understanding Program) at Berkeley: a year-long middle school science curriculum, Issues, Evidence, and You, with embedded assessment throughout the course, including a component of progress maps to visualize where a student was and how their knowledge was progressing. Smith, et al. – The purpose of this article is to suggest ways of using research on children's reasoning and learning to elaborate on existing national standards and to improve large-scale and classroom assessments. The authors suggest that learning progressions—descriptions of successively more sophisticated ways of reasoning within a content domain based on research syntheses and conceptual analyses—can be useful tools for using research on children's learning to improve assessments. Such learning progressions should be organized around central concepts and principles of a discipline (i.e., its big ideas) and show how those big ideas are elaborated, interrelated, and transformed with instruction.
They should also specify how those big ideas are enacted in specific practices that allow students to use them in meaningful ways, enactments the authors describe as learning performances. Learning progressions thus can provide a basis for ongoing dialogue between science learning researchers and measurement specialists, leading to the development of assessments that use both standards documents and science learning research as resources and that will give teachers, curriculum developers, and policymakers more insight into students' scientific reasoning. The authors illustrate their argument by developing a learning progression for an important scientific topic—matter and atomic-molecular theory—and using it to generate sample learning performances and assessment items.
LEAH
LEAH
LEAH
LEAH
LEAH Technical issues of reliability and validity, fairness, consistency, and bias can quickly sink any attempt to measure along a progress variable as described above, or even to develop a reasonable framework that can be supported by evidence. To ensure comparability of results across time and context, procedures are needed to (a) examine the coherence of information gathered using different formats, (b) map student performances onto the progress variables, (c) describe the structural elements of the accountability system—tasks and raters—in terms of the achievement variables, and (d) establish uniform levels of system functioning, in terms of quality control indices such as reliability.