Learning Progressions
Corey Simmons, Jennifer Collymore, Leah Bug & Tracy Thompson
6 April 2010
Definition: Descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn about and investigate a topic over a broad span of time (e.g., 6 to 8 years) (Duschl et al., 2007). "... they lay out in words and examples what it means to move toward more expert understanding" (Wilson & Bertenthal, 2005).
Definition: Learning progressions "... are empirically grounded and testable hypotheses about how students' understanding of, and ability to use, core scientific concepts and explanations and related scientific practices grow and become more sophisticated over time, with appropriate instruction" (Duschl et al., 2007).
Five Essential Characteristics
1. Clear end points defined by social aspirations and the central concepts and themes in the discipline
2. Progress variables that identify the critical dimensions of understanding and skill
3. Stages of progress that define significant intermediate steps in conceptual/skill development
4. Learning performances, which are the operational definitions of what children's understanding and skills look like at each of these stages of progress
5. Assessments that measure student understanding of the key concepts or practices and track their developmental progress over time
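Purely as an illustrative sketch (not part of the original presentation), characteristics 2 through 4 can be pictured as a tiny data structure: stages of progress with attached learning performances, plus hypothetical cut scores that map an assessment score onto a stage. All names here are invented for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Stage:
    name: str                  # hypothetical stage label, e.g. "Naive"
    learning_performance: str  # observable description of understanding at this stage

def stage_for_score(stages: List[Stage], cut_scores: List[float], score: float) -> Stage:
    """Map an assessment score onto a stage of progress.

    cut_scores holds the lower boundary of each stage after the first,
    so len(cut_scores) == len(stages) - 1.
    """
    level = 0
    for cut in cut_scores:
        if score >= cut:
            level += 1
    return stages[level]

# Hypothetical three-stage progression with cut scores at 5 and 10:
stages = [
    Stage("Naive", "describes phenomena in everyday terms"),
    Stage("Transitional", "uses partial disciplinary concepts"),
    Stage("Sophisticated", "reasons with the discipline's central concepts"),
]
print(stage_for_score(stages, [5, 10], 7).name)  # → Transitional
```

The point of the sketch is only that a progression pairs a developmental scale (characteristic 3) with observable performances (characteristic 4), so an assessment result can be located on the scale rather than just scored right or wrong.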
Billow of Clouds
What does this mean for teachers?
Benefits
Origins of the Idea / Key Milestones (1996-2007)
Strands: research, policy/publications, and development of progressions.
- Hughes: Progression in Learning
- Masters & Forster: developmental assessment (ACER)
- Roberts, Wilson & Draney
- NCLB
- Commissioned paper by Catley, Lehrer & Reiser: evolution
- Wilson: Construct Maps
- NRC report: Systems for State Science Assessment
- NRC report: Taking Science to School
- Commissioned paper by Smith, Wiser, Anderson & Krajcik: atomic-molecular theory
- CPRE launched CCII: move toward adaptive instruction
Key People
Mark Wilson, University of California, Berkeley
James Pellegrino, University of Illinois at Chicago
Leona Schauble & Richard Lehrer, Peabody College, Vanderbilt University
Carol L. Smith, University of Massachusetts Boston
Tom Corcoran, Teachers College, Columbia University
Alicia Alonzo, University of Iowa
Joseph Krajcik, University of Michigan
Richard J. Shavelson, Stanford University
Influential Articles
Wilson, M. (2009). Measuring progressions: Assessment structures underlying a learning progression. Journal of Research in Science Teaching, 46(6), 716-730.
Mohan, L., Chen, J., & Anderson, C. (2009). Developing a multi-year learning progression for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching, 46(6), 675-698.
Catley, K., Reiser, B., & Lehrer, R. (2005). Tracing a prospective learning progression for developing understanding of evolution. Commissioned paper prepared for the National Research Council's Committee on Test Design for K-12 Science Achievement, Washington, DC.
Smith, C., Wiser, M., Anderson, C., & Krajcik, J. (2004). Implications of research on children's learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory. Commissioned paper prepared for the National Research Council's Committee on Test Design for K-12 Science Achievement, Washington, DC.
Wilson, M., & Draney, K. (2004). Some links between large-scale and classroom assessments: The case of the BEAR Assessment System. In Towards coherence between classroom assessment and accountability. NSSE Yearbook, 103, Part II.
Influential Publications
Corcoran, T., Mosher, F. A., & Rogat, A. (2009). Learning progressions in science: An evidence-based approach to reform. New York: Center on Continuous Instructional Improvement, Teachers College, Columbia University.
National Research Council. (2005). Systems for state science assessment. Committee on Test Design for K-12 Science Achievement. M. R. Wilson & M. W. Bertenthal (Eds.). Washington, DC: National Academies Press.
National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment. J. Pellegrino, N. Chudowsky, & R. Glaser (Eds.). Washington, DC: National Academy Press.
Centers of Activity
Stanford Education Assessment Laboratory (SEAL); Director: Richard J. Shavelson
The Assessment Triangle
Learning Progressions & the Assessment Triangle
- Cognition: Construct Maps
- Observation: Item Design
- Interpretation: Outcome Space & Measurement Model
Cognition: Construct Maps (Specification of the Construct)
- The working definition of what is to be measured
- Learning performances: suggest connections between the conceptual knowledge in the standards and related abilities and understandings that can be observed and assessed
- A "developmental perspective" on student learning: assessing the development of student understanding of particular concepts and skills over time
Observation: Item Design
- All of the possible forms of items and tasks that can be used to elicit evidence about student knowledge and understanding embodied in the constructs
- An item or task is useful if it elicits important evidence of the construct it is intended to measure
- Groups of items or series of tasks should be assembled with a view to their collective ability to shed light on the full range of the content knowledge, understandings, and skills included in the construct as elaborated by the related learning performances
Interpretation: Outcome Space & Measurement Model
- Outcome space: qualitatively different levels of responses to items and tasks, associated with different levels of performance; scoring guides for student responses to assessment tasks
- Measurement model: the basis on which assessors and users associate scores earned on items and tasks with particular levels of performance; the measurement model must relate the scored responses to the construct (example: Item Response Theory)
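The slide names Item Response Theory as an example measurement model. As an illustration (not taken from the deck itself), the simplest IRT model, the Rasch or one-parameter logistic model, relates a scored response to the construct through a single ability parameter for the student and a single difficulty parameter for the item:

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Probability that a student with the given ability answers an item
    of the given difficulty correctly, under the Rasch (1PL) model:
    P = 1 / (1 + exp(-(ability - difficulty)))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# When ability equals difficulty, the model predicts a 50% chance of success:
print(rasch_probability(0.0, 0.0))  # → 0.5
```

Because both students and items live on the same scale, a scored response pattern can be mapped back to a position on the construct map, which is exactly the link between interpretation and cognition that the assessment triangle demands.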

Editor's Notes

  1. TRACY
  2. TRACY. What the definitions share:
- Vertical development over an extended period of time
- Learning is envisioned as a development of progressive sophistication in understanding and skills within a domain
- No references to grade or age level expectations
- Learning is conceived as a sequence or continuum of increasing expertise
  3. COREY
  4. COREY
  5. COREY
  6. Martin Hughes, Progression in Learning: "about short-term and long-term changes in children's knowledge and understanding which take place as they progress through various domains of learning." Five studies: Ch. 1 (Penny Munn), progression in preschool conceptions of literacy and numeracy; Ch. 2, progression between ages 6 and 13 in understanding of two distinct areas of math and science; Ch. 3, progression between ages 7 and 14 in ideas about historical enquiry and explanation; Ch. 4, abilities of 9- to 14-year-olds to carry out scientific investigations; Ch. 5, understanding of the nature of science between ages 9 and 16.
Masters & Forster: researchers at ACER (Australian Council for Educational Research). In 1997 they created Developmental Assessment in the ACER Assessment Resource Kit (ARK); they also did substantial work evaluating and benchmarking Australian students' literacy and numeracy in 1996-98.
Roberts, Wilson & Draney: created SEPUP (Science Education for Public Understanding Program) at Berkeley, a year-long middle school science course, Issues, Evidence, and You, with assessment embedded throughout and progress maps to visualize where a student was and how their knowledge was progressing.
Smith et al.: The purpose of this article is to suggest ways of using research on children's reasoning and learning to elaborate on existing national standards and to improve large-scale and classroom assessments. The authors suggest that learning progressions, descriptions of successively more sophisticated ways of reasoning within a content domain based on research syntheses and conceptual analyses, can be useful tools for using research on children's learning to improve assessments. Such learning progressions should be organized around central concepts and principles of a discipline (i.e., its big ideas) and show how those big ideas are elaborated, interrelated, and transformed with instruction. They should also specify how those big ideas are enacted in specific practices that allow students to use them in meaningful ways, enactments the authors describe as learning performances. Learning progressions thus can provide a basis for ongoing dialogue between science learning researchers and measurement specialists, leading to the development of assessments that use both standards documents and science learning research as resources and that will give teachers, curriculum developers, and policymakers more insight into students' scientific reasoning. The authors illustrate their argument by developing a learning progression for an important scientific topic, matter and atomic-molecular theory, and using it to generate sample learning performances and assessment items.
  7. LEAH
  8. LEAH
  9. LEAH
  10. LEAH
  11. LEAH. Technical issues of reliability and validity, fairness, consistency, and bias can quickly sink any attempt to measure along a progress variable as described above, or even to develop a reasonable framework that can be supported by evidence. To ensure comparability of results across time and context, procedures are needed to:
(a) examine the coherence of information gathered using different formats;
(b) map student performances onto the progress variables;
(c) describe the structural elements of the accountability system (tasks and raters) in terms of the achievement variables; and
(d) establish uniform levels of system functioning, in terms of quality control indices such as reliability.
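One of the "quality control indices such as reliability" mentioned in note 11 can be made concrete. Cronbach's alpha is a standard internal-consistency index; the minimal implementation below is an illustrative sketch only (it is not from the presentation, and it uses population variances):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from item_scores: a list of per-item score lists,
    one inner list per item, each of length n_students.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(item_scores)           # number of items
    n = len(item_scores[0])        # number of students

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    # Each student's total score across all items:
    totals = [sum(item[s] for item in item_scores) for s in range(n)]
    return (k / (k - 1)) * (1.0 - sum(variance(it) for it in item_scores) / variance(totals))

# Two items that rank three students identically yield alpha = 1.0:
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))
```

An index like this is one way to operationalize point (d): a uniform, comparable measure of how consistently a set of tasks is functioning across administrations.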