Capstone Assessment Project
This presentation examines inter-rater reliability of the HSP portfolio rubric in addition to student and faculty experiences with the Capstone course.


Capstone Assessment Project: Presentation Transcript

  • Capstone Portfolio Assessment Project Susan Kincaid and Jackie Baker-Sennett Department of Human Services and Rehabilitation Western Washington University This project was made possible through support from the Woodring College of Education Assessment Committee, WWU
  • Purpose of the Capstone Assessment Project (CAP)
        • Review recent literature on inter-rater reliability in portfolio assessment practices
        • Examine the rater reliability of the HS capstone portfolio rubric within sections of the course and across program locations (Bellingham, Everett, Cyber)
        • Determine student perceptions of portfolio learning and assessment
        • Examine instructor’s experiences with the portfolio assessment process and thoughts about teaching the Capstone Portfolio course
  • Background: The HSP Curriculum
    As part of a comprehensive curriculum revision in 2006, the Human Services Program moved to a standards-based curriculum, with student learning assessed through a comprehensive Capstone Portfolio.
  • HSP Capstone Portfolio
      • A culminating course, project, and benchmark assessment in the Human Services major
      • Assesses students’ knowledge and skills as they relate to CSHSE National Standards (Council for Standards in Human Services Education) and program outcomes in the areas of technology, information literacy, critical thinking, and writing proficiency.
  • Portfolio Coursework
    • All HSP students take a sequence of two courses:
      • HSP 304: Introduction to Portfolio Learning (1 credit during first 5 credits in major)
      • HSP 495: Capstone Portfolio (4 credits during last 5 credits in major)
  • While most Human Services programs in the US require students to complete some type of showcase portfolio, to our knowledge Western Washington University is unique in requiring a comprehensive standards-based capstone portfolio as a benchmark assessment for program completion.
  • HSP 304: Portfolio Learning
    • Introduction to Portfolio Learning (HSP 304) is taken during the first quarter in the program and provides:
        • Overview of the HSP Curriculum.
        • Introduction to CSHSE National Standards and program-specific outcomes (technology, information literacy, critical thinking, and writing proficiency).
        • Orientation to portfolio organization and design with all students receiving the Student Guide to Capstone Portfolio Process to support learning.
  • Portfolio Development Process
      • Instructors work with course plans that are aligned with CSHSE standards. Course plans typically identify at least one assignment per course appropriate for inclusion in the capstone portfolio.
      • Over a two-year course of study students are asked to update their portfolios in each class to reflect learning gained through coursework and field study.
  • Capstone Portfolio (HSP 495)
      • Prior to graduation, students finalize and present portfolios in the culminating course, HSP 495: Capstone Portfolio
      • Portfolios can be paper, web, or CD-ROM based (or any combination)
      • Students address all CSHSE and program standards through written essays and evidence documented from course work and field study (see Student Guide, Appendix F)
  • Completion
  • Public Presentation
      • Six sections of the course were offered during Spring 2008 (113 students enrolled)
      • Four of the six sections hosted a public display of portfolios (Everett and Bellingham)
  • Spring 2008, Bellingham
  • Assessment
      • The portfolio is a high-stakes exit assessment, necessitating reliable scoring methods
      • A department goal is to ensure we have an assessment tool and philosophy that reflects our diverse student body and variations in instructors’ pedagogical viewpoints.
  • The Rubric
      • Contains seven subscales assessing National Standards and Program Standards (critical thinking, information literacy, technology, writing proficiency), portfolio organization and demonstration of professional self
      • Used across program locations and sections
      • Total score serves as a benchmark exit assessment for completion of the major. (Current passing score is 70%)
      • Sub-scores are included as a performance assessment in the Woodring Information System (WIS) tracking system
    Handout
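Because the rubric's subscales sum to a 100-point total and the stated passing score is 70%, the benchmark exit decision can be sketched as below. The subscale point caps come from the slides; the function and field names are hypothetical, not the program's actual tooling.

```python
# Sketch of the rubric's benchmark scoring logic.
# Point caps per subscale are taken from the slides; all names are ours.
SUBSCALES = {
    "national_standards": 40,   # CSHSE National Standards
    "critical_thinking": 10,
    "information_literacy": 10,
    "technology": 10,
    "writing_proficiency": 10,
    "professional_self": 10,
    "organization": 10,
}
TOTAL_POINTS = sum(SUBSCALES.values())  # 100
PASSING_FRACTION = 0.70                 # current passing score is 70%

def total_score(scores: dict) -> float:
    """Sum subscale scores, checking each stays within its point cap."""
    for name, points in scores.items():
        if points > SUBSCALES[name]:
            raise ValueError(f"{name} exceeds its {SUBSCALES[name]}-point cap")
    return sum(scores.values())

def passes(scores: dict) -> bool:
    """Benchmark exit decision: total must reach 70% of 100 points."""
    return total_score(scores) >= PASSING_FRACTION * TOTAL_POINTS
```

For example, a portfolio scoring 35/40 on National Standards and 8–9 on each remaining subscale totals 87 points and passes; one totaling 56 points does not.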
  • CAP Work to Date:
      • Reviewed portfolio rater reliability literature
      • Examined rater reliability of rubric used by WWU Human Services faculty in Bellingham, Everett, and Cyber (electronic portfolios)
      • Assessed student perceptions of portfolio learning (through written response and exit survey data)
      • Determined faculty perceptions of portfolio learning/teaching
  • Key Findings from Literature: Portfolio Rater Reliability Studies
        • Inter-rater reliability in published studies of portfolio assessment typically falls in the .55-.80 range (average reliability is approximately .64, marginal)
        • Greatest predictor of inter-rater reliability in medical student portfolios is “quality of reflection” accounting for 66% of variance (Driessen, 2006)
        • Reliable assessment must involve detailed criteria for the learner and the evaluator presented at the beginning of the learning experience (NCLRC, 2008)
        • Reliability increases with triangulation (inclusion of 3 or more pieces of evidence in a portfolio section)
        • Mixed findings regarding reliability when instructors assess their own students’ portfolios.
  • Examining Reliability of HS Capstone Rubric
    • Data includes:
      • Instructors’ total scores and subscale portfolio scores from 103 students enrolled in five sections of HSP 495 (Spring 2008).
      • 20% of portfolios from Bellingham, Everett, and Cyber sites were randomly selected for reliability coding (by 2 other raters).
  • Table 1: Mean (SD) Rubric Scores by Section
    (n = 103; includes Bellingham, Everett, and Bremerton sections)

    Dimension                      Section #1    Section #2    Section #3    Section #4    Section #5
    Professional Self (10pt)       8.68 (1.06)   9.91 (.31)    9.0 (1.2)     8.79 (1.79)   9.83 (.58)
    Technology (10pt)              8.83 (.97)    9.64 (.67)    9.73 (.72)    6.96 (2.29)   9.58 (1.44)
    Information Literacy (10pt)    8.40 (2.11)   9.55 (.82)    9.73 (.70)    6.92 (2.4)    10 (0)
    Critical Thinking (10pt)       8.68 (1.17)   9.27 (1.0)    8.0 (2.83)    7.0 (2.45)    9.5 (1.73)
    Writing Proficiency (10pt)     8.92 (1.09)   9.36 (1.57)   9.27 (1.32)   7.29 (2.14)   10 (0)
    National Standards (40pt)      34.53 (3.93)  37.9 (4.41)   35.75 (3.8)   34.29 (5.65)  39.79 (.72)
    Portfolio Organization (10pt)  9.10 (.85)    9.55 (.82)    9.50 (.96)    8.21 (1.6)    10 (0)
    Section Mean Total             87.13         95.18         90.98         79.46         98.71
  • Rubric Reliability: Analysis of Results
      • 21 randomly selected portfolios from 5 sections (12 Bellingham, 6 Everett, 4 Cyber)
      • 2 raters scored all 21 portfolios on all 7 dimensions
  • Intraclass Correlation Coefficients

    Dimension                                         ICC    p-value
    1. Organization                                   .399   NS
    2. CSHSE National Standards [for HS Education]    .524   NS
    3. Writing proficiency                            .259   NS
    4. Critical Thinking                              .463   NS
    5. Information literacy [Tied to ALA standards]   .153   NS
    6. Technology [Tied to ISTE standards]            .275   NS
    7. Professional Self                              .712   .009
    Total Score                                       .491   NS
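As a rough illustration of how such coefficients are computed from the two-rater sample, here is a minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater). The slides do not specify which ICC variant was used, so the formula choice and all names here are assumptions.

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` has shape (n_subjects, k_raters), e.g. 21 portfolios x 2 raters.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    subj_means = ratings.mean(axis=1)   # one mean per portfolio
    rater_means = ratings.mean(axis=0)  # one mean per rater
    # Mean squares from the two-way ANOVA decomposition
    ms_rows = k * np.sum((subj_means - grand) ** 2) / (n - 1)   # between portfolios
    ms_cols = n * np.sum((rater_means - grand) ** 2) / (k - 1)  # between raters
    resid = ratings - subj_means[:, None] - rater_means[None, :] + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

With identical ratings the coefficient is 1.0; a constant one-point offset between raters lowers it, because this variant penalizes absolute disagreement, not just inconsistent rank ordering.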
  • Primary Concerns
    • Unacceptable estimates of reliability, given that this is a high-stakes assessment
    • Lack of consistency in overall scoring across sections of the course (variation in both means and standard deviations across sections)
  • Possible Explanations
      • First cohorts of students to complete capstone portfolio.
      • Raters and instructors had not cross trained.
      • Restricted range of scores, pointing to a need to solidify categories rather than compare scores
      • Students had the opportunity to receive instructor and peer feedback and revise portfolios prior to grading and cross rating
  • Student Perceptions of Portfolio Learning
    • Data:
      • Written response surveys from 93 students
        • Bellingham=57
        • Everett= 24
        • Bremerton= 12
      • Senior exit survey data from 112 students
  • Students’ Written Responses Regarding Portfolio Learning
    • Summary of Strengths:
      • Comprehensive review and synthesis of two years in the program
      • The peer review process was effective for portfolio development/completion
      • A culminating project to be proud of
      • Introductory class in the first quarter is critical for preparation
  • Summary of Student Concerns
      • Some 304/495 instructors did not understand all standards.
      • More communication between faculty needed for scoring.
      • Promote digital portfolios (too much paper).
      • Portfolio checkpoints are needed throughout the program.
      • The Introductory course did not effectively prepare students (HSP 304).
      • Excessive workload in HSP 495 (class time not valuable).
      • The HS Program is too focused on standards.
      • The portfolio is not valuable for career related purposes.
      • Provide sample portfolios.
  • Student Exit Survey Results*

    Statement                                                                     Agree   Disagree
    My capstone portfolio demonstrates that I meet CSHSE Standards.               56%     45%
    Creating a Capstone Portfolio was a valuable part of my learning experience.  27%     45%
    Overall: Satisfaction with HS quality of course content.                      91%     9%

    *Includes 112 exiting seniors, Spring 2008.
  • Instructor Experience with Portfolio Learning and Teaching
    • Includes:
      • Written notes from group meeting with 4 of 6 instructors (HSP 495)
  • Findings: Instructors’ Experiences with Portfolio Learning (4 of 6 faculty)
    • Limitations
      • Most students didn’t keep up with portfolios after their introductory course (no checkpoints)
      • Extensive work for students and instructors
      • Class time is not valuable
      • Are students learning or just compiling? (Should we be charging tuition for this kind of course?)
      • Too much paper (make more use of CD-ROMs)
      • Class doesn’t “feel good”; it seems arbitrary
  • Findings: Instructors’ Experiences with Portfolio Learning
    • Strengths
      • Discussion and peer review was very helpful.
      • Final showcase event in Bellingham and Everett was a good culminating event.
      • Students initially complained about quantity of writing, but many thanked instructors afterwards.
      • Capstone learning has potential, but we are not “there” yet.
  • Continuous Improvement…
    • Summer/Fall 2008
        • Revised course plan assignments and approach to teaching for the Intro. to Portfolio course, HSP 304.
        • Revised capstone portfolio grading rubric.
        • Changed some portfolio content from mandatory to optional.
        • Increased number of full time faculty teaching the Portfolio series.
        • Instructors shared examples of exemplary portfolios with students.
  • Continuous Improvement
      • Future Directions
      • Solidify categories and assign points to rubric.
      • Reanalyze reliability data to identify a Discrepancy Index (percentage of time raters score the same, are one category apart, etc.)
      • Film students discussing portfolio learning process and share with future groups.
      • Determine how to make most of class time (consider a small group tutorial model).
      • Continue to collect and analyze data from students (written responses and exit survey) during Spring 2009
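The proposed Discrepancy Index (percentage of the time raters score the same, are one category apart, and so on) can be tabulated directly from paired category scores. The following is a minimal sketch under that reading of the slide; the function name, category labels, and two-or-more bucketing are assumptions.

```python
from collections import Counter

def discrepancy_index(rater_a, rater_b):
    """Tabulate how far apart two raters' category scores fall:
    fraction exactly equal, one category apart, and two or more apart."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same portfolios")
    # Cap gaps at 2 so anything beyond one category lands in one bucket
    gaps = Counter(min(abs(a - b), 2) for a, b in zip(rater_a, rater_b))
    n = len(rater_a)
    return {
        "exact": gaps[0] / n,
        "one_apart": gaps[1] / n,
        "two_or_more": gaps[2] / n,
    }
```

For example, `discrepancy_index([3, 2, 4, 4], [3, 3, 4, 2])` reports exact agreement half the time, one-apart a quarter of the time, and two-or-more a quarter of the time.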
  • Future Directions - (Cont.)
      • Prior to Scoring: Systematize the rater calibration process across all sites.
      • During Scoring: Engage in systematic dialogue among instructors while making scoring decisions. Ensure that irrelevant criteria are not considered in the scores.
      • Share and discuss findings of CAP with the HSR department and other interested faculty (Winter 2009).
  • “A portfolio tells a story. It is the story of knowing. Knowing about things... Knowing oneself... Knowing an audience... Students prove what they know with samples of their work.” (Paulson & Paulson, 1991, p. 2)
  • Resources
          • Baume, D., & Yorke, M. (2002). The reliability of assessment by portfolio on a course to develop and accredit teachers in higher education. Studies in Higher Education, 27, 7–25.
          • Driessen, E., van Tartwijk, J., van der Vleuten, C., & Wass, V. (2007). Portfolios in medical education: Why do they meet with mixed success? A systematic review. Medical Education, 41, 1224–1233.
          • Driessen, E. W., Overeem, K., Van Tartwijk, J., Van Der Vleuten, C., & Muijtjens, A. (2006). Validity of portfolio assessment: Which qualities determine ratings? Medical Education, 40(9), 862–866.
          • Hamp-Lyons, L., & Condon, W. (2000). Assessing the portfolio: Principles for practice, theory and research. Cresskill, NJ: Hampton Press.
          • Johnston, B. (2004). Summative assessment of portfolios: An examination of different approaches to agreement over outcomes. Studies in Higher Education, 29(3), 395–412.
          • Kimbell, R., & Stables, K. (2008). Researching design learning: Issues and findings from two decades of research and development. New York: Springer.
  • Resources (Cont.)
          • McMullan, M., Endacott, R., Gray, M. A., Jasper, M., Miller, C. M. L., Scholes, J., & Webb, C. (2003). Portfolios and assessment of competence: A review of the literature. Journal of Advanced Nursing, 41(3), 283–294.
          • McNamara, T. L., & Bailey, R. (2006). Faculty/staff perceptions of a standards-based exit portfolio system for graduate students. Innovative Higher Education, 3(2), 129–141.
          • National Capital Language Resource Center (2008). Portfolio assessment in the foreign language classroom. http://www.nclrc.org/portfolio/about.html
          • Ostheimer, M. W., & White, E. M. (2005). Portfolio assessment in an American college. Assessing Writing, 10, 61–73.
          • Pitts, J., Coles, C., & Thomas, P. (1999). Educational portfolios in the assessment of general practice trainers: Reliability of assessors. Medical Education, 33, 515–520.
          • Yao, Y., Thomas, M., Nickens, N., Downing, J. A., Burkett, R. S., & Lamson, S. (2008). Validity evidence of an electronic portfolio for preservice teachers. Educational Measurement: Issues and Practice, 10–24.