TRACY – What the definitions share: vertical development over an extended period of time; learning envisioned as progressive sophistication in understanding and skills within a domain; no references to grade- or age-level expectations; learning conceived as a sequence or continuum of increasing expertise.
Martin Hughes – Progression in Learning – "about short-term and long-term changes in children's knowledge and understanding which take place as they progress through various domains of learning." Five studies: Ch. 1 (Penny Munn) – progression in preschool conceptions of literacy and numeracy; Ch. 2 – progression between ages 6 and 13 in understanding of two distinct areas of math and science; Ch. 3 – progression between ages 7 and 14 in ideas about historical enquiry and explanation; Ch. 4 – progression between ages 9 and 14 in abilities to carry out scientific investigations; Ch. 5 – understanding of the nature of science between ages 9 and 16. Masters & Forster – researchers at ACER (Australian Council for Educational Research). In 1997 they created Developmental Assessment as part of the ACER Assessment Resource Kit (ARK); they also did extensive work evaluating and benchmarking Australian students' literacy and numeracy in 1996–98. Roberts, Wilson & Draney – created SEPUP (Science Education for Public Understanding Program) at Berkeley: a year-long middle school science curriculum, Issues, Evidence, and You, with assessment embedded throughout the course, including progress maps to visualize where a student is and how their knowledge is progressing. Smith et al. – The purpose of this article is to suggest ways of using research on children's reasoning and learning to elaborate on existing national standards and to improve large-scale and classroom assessments. The authors suggest that learning progressions (descriptions of successively more sophisticated ways of reasoning within a content domain, based on research syntheses and conceptual analyses) can be useful tools for using research on children's learning to improve assessments. Such learning progressions should be organized around central concepts and principles of a discipline (i.e., its big ideas) and show how those big ideas are elaborated, interrelated, and transformed with instruction.
They should also specify how those big ideas are enacted in specific practices that allow students to use them in meaningful ways, enactments the authors describe as learning performances. Learning progressions thus can provide a basis for ongoing dialogue between science learning researchers and measurement specialists, leading to the development of assessments that use both standards documents and science learning research as resources and that will give teachers, curriculum developers, and policymakers more insight into students' scientific reasoning. The authors illustrate their argument by developing a learning progression for an important scientific topic—matter and atomic-molecular theory—and using it to generate sample learning performances and assessment items.
LEAH Technical issues of reliability and validity, fairness, consistency, and bias can quickly sink any attempt to measure along a progress variable as described above, or even to develop a reasonable framework that can be supported by evidence. To ensure comparability of results across time and context, procedures are needed to (a) examine the coherence of information gathered using different formats, (b) map student performances onto the progress variables, (c) describe the structural elements of the accountability system—tasks and raters—in terms of the achievement variables, and (d) establish uniform levels of system functioning, in terms of quality control indices such as reliability.
Learning Progressions Corey Simmons, Jennifer Collymore, Leah Bug & Tracy Thompson 6 April, 2010
Definition Descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn about and investigate a topic over a broad span of time (e.g., 6 to 8 years). (Duschl et al., 2007) "… they lay out in words and examples what it means to move toward more expert understanding" (Wilson & Bertenthal, 2005)
Definition "… are empirically grounded and testable hypotheses about how students' understanding of, and ability to use, core scientific concepts and explanations and related scientific practices grow and become more sophisticated over time, with appropriate instruction" (Duschl et al., 2007).
Five Essential Characteristics 1. Clear end points defined by social aspirations and the central concepts and themes in the discipline 2. Progress variables that identify the critical dimensions of understanding and skill 3. Stages of progress that define significant intermediate steps in conceptual/skill development 4. Learning performances, which are the operational definitions of what children's understanding and skills look like at each of these stages of progress 5. Assessments that measure student understanding of the key concepts or practices and track their developmental progress over time.
Billow of Clouds <ul><li>Each cloud represents the skills, understandings, and knowledge in the sequence in which they typically develop. The clouds are "building blocks" toward the learning goal, laying out the pathway to success. </li></ul><ul><li>Explicit learning progressions can provide the clarity that teachers need to achieve both short-term and long-term learning goals. </li></ul>
What does this mean for teachers? <ul><li>By describing a pathway of learning, learning progressions can </li></ul><ul><li>assist teachers in: </li></ul><ul><li>Planning instruction </li></ul><ul><li>Tying assessment to learning goals, so the evidence elicited reveals students' understanding and skill at a given point </li></ul><ul><li>Curriculum development </li></ul><ul><li>When teachers understand the continuum of learning in a domain and know students' current status relative to learning goals, they are better able to decide what the next steps in learning should be. </li></ul>
Benefits <ul><ul><li>Refine education policies for coherence and alignment </li></ul></ul><ul><ul><li>Improve: </li></ul></ul><ul><ul><ul><li>Standards </li></ul></ul></ul><ul><ul><ul><li>Assessment </li></ul></ul></ul><ul><ul><ul><li>Curricula </li></ul></ul></ul><ul><ul><ul><li>Instruction </li></ul></ul></ul><ul><ul><li>Greater accountability </li></ul></ul><ul><ul><li>Promote further research </li></ul></ul>
Origins of the Idea / Key Milestones (1996–2007) Research: Hughes – Progression in Learning (1996); Masters & Forster – Developmental Assessment (1997); Roberts, Wilson & Draney – SEPUP progress maps. Policy / Publications: NCLB (2002); commissioned paper – Smith, Wiser, Anderson & Krajcik: Atomic-Molecular Theory (2004); commissioned paper – Catley, Lehrer & Reiser: Evolution (2005); Wilson – Construct Maps (2005); NRC report: Systems for State Science Assessment (2005); NRC report: Taking Science to School (2007). Development of Progressions: CPRE launched CCII – move toward adaptive instruction.
Key People Mark Wilson, University of California, Berkeley James Pellegrino – University of Illinois at Chicago Leona Schauble & Richard Lehrer, Peabody College – Vanderbilt University Carol L. Smith, University of Massachusetts Boston Tom Corcoran, Teachers College – Columbia University Alicia Alonzo, University of Iowa Joseph Krajcik, University of Michigan Richard J. Shavelson, Stanford University
Influential Articles Wilson, M. (2009). Measuring progressions: Assessment structures underlying a learning progression. Journal of Research in Science Teaching, 46(6), 716–730. Mohan, L., Chen, J., & Anderson, C. W. (2009). Developing a multi-year learning progression for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching, 46(6), 675–698. Catley, K., Lehrer, R., & Reiser, B. (2005). Tracing a prospective learning progression for developing understanding of evolution. Commissioned paper prepared for the National Research Council's Committee on Test Design for K–12 Science Achievement, Washington, DC. Smith, C., Wiser, M., Anderson, C. W., & Krajcik, J. (2004). Implications of research on children's learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory. Commissioned paper prepared for the National Research Council's Committee on Test Design for K–12 Science Achievement, Washington, DC. Wilson, M., & Draney, K. (2004). Some links between large-scale and classroom assessments: The case of the BEAR Assessment System. In Towards Coherence Between Classroom Assessment and Accountability, NSSE Yearbook, 103(Part II).
Influential Publications Corcoran, T., Mosher, F. A., & Rogat, A. (2009). Learning progressions in science: An evidence-based approach to reform. New York: Center on Continuous Instructional Improvement, Teachers College, Columbia University. National Research Council. (2005). Systems for state science assessment. Committee on Test Design for K–12 Science Achievement. M. R. Wilson & M. W. Bertenthal (Eds.). Washington, DC: National Academies Press. National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment. J. Pellegrino, N. Chudowsky, & R. Glaser (Eds.). Washington, DC: National Academy Press.
Centers of Activity <ul><li>Berkeley Evaluation and Assessment Research (BEAR) Center – 1994 </li></ul><ul><li>Mark Wilson </li></ul><ul><li>Consortium for Policy Research in Education (CPRE) – 1985 </li></ul><ul><li>Center on Continuous Instructional Improvement (CCII), a CPRE center </li></ul><ul><ul><li>Co-PI: Tom Corcoran </li></ul></ul><ul><li>Learning Research and Development Center at the University of Pittsburgh – 1963 </li></ul><ul><li>Director: Charles Perfetti </li></ul>Stanford Education Assessment Laboratory (SEAL) – Director: Richard J. Shavelson
Learning Progressions & the Assessment Triangle: Cognition – Construct Maps; Observation – Item Design; Interpretation – Outcome Space and Measurement Model
Cognition – Construct Maps (specification of the construct): the working definition of what is to be measured. Learning performances suggest connections between the conceptual knowledge in the standards and related abilities and understandings that can be observed and assessed. This reflects a "developmental perspective" on student learning: assessing the development of student understanding of particular concepts and skills over time.
Observation – Item Design: all of the possible forms of items and tasks that can be used to elicit evidence about the student knowledge and understanding embodied in the constructs. An item or task is useful if it elicits important evidence of the construct it is intended to measure; groups of items or series of tasks should be assembled with a view to their collective ability to shed light on the full range of the content knowledge, understandings, and skills included in the construct as elaborated by the related learning performances. <ul><li>Example: Ordered Multiple Choice (OMC) questions </li></ul><ul><ul><li>Designed so that each answer choice is linked to a developmental level of student understanding, facilitating diagnostic interpretation of student responses </li></ul></ul>
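The OMC idea above can be sketched in code. The item stem, answer choices, and level assignments below are hypothetical illustrations of the format, not drawn from any published instrument:

```python
# Sketch of how an Ordered Multiple Choice (OMC) item could be represented
# and scored. Unlike right/wrong scoring, each answer choice maps to a
# developmental level on the progress variable (1 = naive, 4 = most
# sophisticated), so the chosen option itself is diagnostic.

omc_item = {
    "stem": "What happens to the mass of water when it evaporates?",
    "choices": {
        "A": ("The water disappears, so the mass is gone.", 1),
        "B": ("The water soaks into the air and loses mass.", 2),
        "C": ("The water becomes vapor; some mass is lost as heat.", 3),
        "D": ("The water becomes vapor; mass is conserved.", 4),
    },
}

def diagnose(item, response):
    """Return the developmental level implied by a student's choice."""
    _, level = item["choices"][response]
    return level

print(diagnose(omc_item, "B"))  # prints 2: an intermediate conception
```

Scoring a set of such items places each student at a level on the progress variable rather than on a simple percent-correct scale.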
Interpretation – Outcome Space and Measurement Model. Outcome space: qualitatively different levels of responses to items and tasks, associated with different levels of performance; realized as scoring guides for student responses to assessment tasks. Measurement model: the basis on which assessors and users associate scores earned on items and tasks with particular levels of performance; the measurement model must relate the scored responses to the construct. Example: Item Response Theory.
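To illustrate how a measurement model relates scored responses to a position on the construct, here is a minimal sketch of the simplest Item Response Theory model, the one-parameter (Rasch) model; the ability and difficulty values used are arbitrary:

```python
import math

def rasch_probability(theta, b):
    """Rasch model: probability that a student with ability theta answers
    an item of difficulty b correctly (both measured on a logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student whose ability equals the item's difficulty has a 50% chance:
print(rasch_probability(0.5, 0.5))  # prints 0.5

# Ability one logit above the item's difficulty raises the probability:
print(round(rasch_probability(1.5, 0.5), 3))  # prints 0.731
```

Because ability and difficulty sit on the same scale, estimated abilities can be read against the levels of the construct map, which is what makes this family of models useful for tracking progress along a learning progression.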