CAPSTONES
Rubric for Assessing the Use of Capstone Experiences for Assessing Program Learning Outcomes

Criterion: Relevant Outcomes and Lines of Evidence Identified
• Initial: It is not clear which program outcomes will be assessed in the capstone course.
• Emerging: The relevant outcomes are identified, e.g., ability to integrate knowledge to solve complex problems; however, concrete plans for collecting evidence for each outcome have not been developed.
• Developed: Relevant outcomes are identified. Concrete plans for collecting evidence for each outcome are agreed upon and used routinely by faculty who staff the capstone course.
• Highly Developed: Relevant evidence is collected; faculty have agreed on explicit criteria statements, e.g., rubrics, and have identified examples of student performance at varying levels of mastery for each relevant outcome.

Criterion: Valid Results
• Initial: It is not clear that potentially valid evidence for each relevant outcome is collected, and/or individual faculty use idiosyncratic criteria to assess student work or performances.
• Emerging: Faculty have reached general agreement on the types of evidence to be collected for each outcome; they have discussed relevant criteria for assessing each outcome, but these are not yet fully defined.
• Developed: Faculty have agreed on concrete plans for collecting relevant evidence for each outcome. Explicit criteria, e.g., rubrics, have been developed to assess the level of student attainment of each outcome.
• Highly Developed: Assessment criteria, such as rubrics, have been pilot-tested and refined over time; they usually are shared with students. Feedback from external reviewers has led to refinements in the assessment process, and the department uses external benchmarking data.

Criterion: Reliable Results
• Initial: Those who review student work are not calibrated to apply assessment criteria in the same way; there are no checks for inter-rater reliability.
• Emerging: Reviewers are calibrated to apply assessment criteria in the same way, or faculty routinely check for inter-rater reliability.
• Developed: Reviewers are calibrated to apply assessment criteria in the same way, and faculty routinely check for inter-rater reliability.
• Highly Developed: Reviewers are calibrated, and faculty routinely find assessment data have high inter-rater reliability.

Criterion: Results Are Used
• Initial: Results for each outcome may or may not be collected. They are not discussed among faculty.
• Emerging: Results for each outcome are collected and may be discussed by the faculty, but results have not been used to improve the program.
• Developed: Results for each outcome are collected, discussed by faculty, analyzed, and used to improve the program.
• Highly Developed: Faculty routinely discuss results, plan needed changes, secure necessary resources, and implement changes. They may collaborate with others, such as librarians or Student Affairs professionals, to improve results. Follow-up studies confirm that changes have improved learning.

Criterion: The Student Experience
• Initial: Students know little or nothing about the purpose of the capstone or the outcomes to be assessed. It is just another course or requirement.
• Emerging: Students have some knowledge of the purpose and outcomes of the capstone. Communication is occasional, informal, and left to individual faculty or advisors.
• Developed: Students have a good grasp of the purpose and outcomes of the capstone and embrace it as a learning opportunity. Information is readily available in advising guides, etc.
• Highly Developed: Students are well-acquainted with the purpose and outcomes of the capstone and embrace it. They may participate in refining the experience, outcomes, and rubrics. Information is readily available.
How Visiting Team Members Can Use the Capstone Rubric
Conclusions should be based on discussion with relevant department members (e.g., chair, assessment coordinator, faculty). A variety of capstone
experiences can be used to collect assessment data, such as:
• courses, such as senior seminars, in which advanced students are required to consider the discipline broadly and integrate what they have learned
     in the curriculum
• specialized, advanced courses
• advanced-level projects conducted under the guidance of a faculty member or committee, such as research projects, theses, or dissertations
• advanced-level internships or practica, e.g., at the end of an MBA program
Assessment data for a variety of outcomes can be collected in such courses, particularly outcomes related to integrating and applying the discipline,
information literacy, critical thinking, and research and communication skills.
The rubric has five major dimensions:
1. Relevant Outcomes and Evidence Identified. It is likely that not all program learning outcomes can be assessed within a single capstone course or
     experience. Questions: Have faculty explicitly determined which program outcomes will be assessed in the capstone? Have they agreed on concrete
     plans for collecting evidence relevant to each targeted outcome? Have they agreed on explicit criteria, such as rubrics, for assessing the evidence?
     Have they identified examples of student performance for each outcome at varying performance levels (e.g., below expectations, meeting, exceeding
     expectations for graduation)?
2. Valid Results. A valid assessment of a particular outcome leads to accurate conclusions concerning students’ achievement of that outcome.
     Sometimes faculty collect evidence that does not have the potential to provide valid conclusions. For example, a multiple-choice test will not provide
     evidence of students’ ability to deliver effective oral presentations. Assessment requires the collection of valid evidence and judgments about that
     evidence that are based on well-established, agreed-upon criteria that specify how to identify low, medium, or high-quality work. Questions: Are
     faculty collecting valid evidence for each targeted outcome? Are they using well-established, agreed-upon criteria, such as rubrics, for assessing the
     evidence for each outcome? Have faculty pilot-tested and refined their process based on experience and feedback from external reviewers? Are they
     sharing the criteria with their students? Are they using benchmarking (comparison) data?
3. Reliable Results. Well-qualified judges should reach the same conclusions about an individual student's achievement of a learning outcome,
     demonstrating inter-rater reliability. If two judges independently assess a set of materials, their ratings can be correlated. Sometimes a discrepancy
     index is used. How often do the two raters give identical ratings, ratings one point apart, ratings two points apart, etc.? Data are reliable if the
     correlation is high and/or if the discrepancies are small. Raters generally are calibrated (“normed”) to increase reliability. Calibration usually involves
     a training session in which raters apply rubrics to pre-selected examples of student work that vary in quality, then reach consensus about the rating
     each example should receive. The purpose is to ensure that all raters apply the criteria in the same way so that each student’s product receives the
     same score, regardless of rater. Questions: Are reviewers calibrated? Are checks for inter-rater reliability made? Is there evidence of high inter-rater
     reliability?
4. Results Are Used. Assessment is a process designed to monitor and improve learning, so assessment findings should have an impact. Faculty
     should reflect on results for each outcome and decide if they are acceptable or disappointing. If results do not meet faculty standards, faculty should
     determine which changes should be made, e.g., in pedagogy, curriculum, student support, or faculty support. Questions: Do faculty collect
     assessment results, discuss them, and reach conclusions about student achievement? Do they develop explicit plans to improve student learning?
     Do they implement those plans? Do they have a history of securing necessary resources to support this implementation? Do they collaborate with
     other campus professionals to improve student learning? Do follow-up studies confirm that changes have improved learning?
5. The Student Experience. Students should understand the purposes different educational experiences serve in promoting their learning and
development and know how to take advantage of them; ideally they should also participate in shaping those experiences. Thus it is essential to
communicate to students consistently and include them meaningfully. Questions: Are purposes and outcomes communicated to students? Do they
understand how capstones support learning? Do they participate in reviews of the capstone experience, its outcomes, criteria, or related activities?
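The reliability checks described under dimension 3 can be sketched numerically. The following is a minimal Python sketch, with hypothetical rubric scores invented for illustration: it correlates the ratings of two raters who scored the same set of capstone papers on a 4-point rubric and tallies a simple discrepancy index (how often the ratings are identical, one point apart, and so on).

```python
from collections import Counter

# Hypothetical scores (1-4) from two calibrated raters who each
# scored the same ten capstone papers against a shared rubric.
rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]

def pearson(x, y):
    """Pearson correlation between two equal-length rating lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# Discrepancy index: distribution of absolute rating differences.
discrepancies = Counter(abs(a - b) for a, b in zip(rater_a, rater_b))

print(f"correlation: {pearson(rater_a, rater_b):.2f}")
print(f"identical ratings: {discrepancies[0]} of {len(rater_a)} papers")
print(f"one point apart:   {discrepancies[1]} of {len(rater_a)} papers")
```

With these invented scores the correlation is high and the discrepancies are small, which, by the criteria above, is the pattern that would indicate reliable data; in practice departments often also compute a chance-corrected agreement statistic such as Cohen's kappa.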
