Outcomes Assessment Overview

A general introduction to broad outcomes assessment issues and approaches in higher education.

Published in: Education
  • In a societal and political context where accountability seems to be ascendant, accreditation falls somewhere in between accountability and assessment
  • Power corrupts, and PowerPoint corrupts absolutely
  • Whole-group brainstorm. At its simplest, assessment is "the process of gathering information to make decisions."
  • Emphasis in the standards is on two distinct approaches, institutional effectiveness and student learning outcomes, but the more critical distinction derives from underlying worldviews. What's competing are fundamental worldviews that shape and inform the ways people approach this work, sometimes at a conscious level but often not; they reflect more than just the particular programs or strategies people use. For an excellent in-depth elaboration of some of the historical roots and present-day consequences of these worldviews, see Lorrie Shepard's article "The role of assessment in a learning culture," Educational Researcher, October 2000 (vol. 29, no. 7, pp. 4-14). We need a rigorous, useful peer review process that links or balances the accountability and assessment perspectives. The challenge is involving thoughtful peers and producing work that is meaningful for the campus and translatable to external audiences.
  • EXCELLENCE: Performance, not prestige; results, not resources or reputation. Incorporate that notion of excellence into peer review processes, i.e., accreditation. ACCOUNTABILITY: Professional accountability, not bureaucratic accountability; a kind of accountability to ourselves, our peers, and our students that involves continual self-reflection and learning, the kind most of us are already doing. CAMPUS COMMUNITY: Overcome the traditional isolation of academic work and higher education culture. It is important to struggle to define collectively (which is not at all easy for us to find the time for, let alone to do) the goals and ends we have in common for our students, and what we think the institution should stand for and be about.
  • (Individually, then in small groups) Think about the intro course for your discipline or program. What core learning goals do you have for students who take and complete that course?
  • Traditional questions: What do we teach? What do we cover? It's important to understand that the conversations and debates we have about these areas are informed and shaped by our operating assumptions and personal theories about the more fundamental issues of knowledge, learning, and assessment. We need to examine these often tacit assumptions and re-frame the questions.
  • Knowledge is "competence with respect to valued enterprises" (Etienne Wenger, Communities of Practice). Understanding it requires using it, and using it involves both changing the user's view of the world and adopting the belief system of the culture in which it is used.
  • Discussion—see handout
  • Traditional question: How do we grade? It's important to understand that the conversations and debates we have about these areas are informed and shaped by our operating assumptions and personal theories about the more fundamental issues of knowledge, learning, and evidence. Evidence involves: degrees of certainty and uncertainty; subjective interpretations, even about "objective" data; and meaning-making and judgment processes for teachers, students, and assessors.
  • Assessment: the process of gathering evidence to make judgments toward the goal of improvement. 1) The initial reaction to calls for assessment is often to do whatever is the easiest, cheapest, quickest way to get "them" (accreditors, legislators, administrators, ______) off "our" backs; this approach rarely serves the "assessment as learning" framework very well. 2) What you do with and how you use the evidence or data you gather matters at least as much as its precision, and is far more important than its "rigor." Rigor is fine, but someone will always want to argue about the evidence anyway, so spend time and energy figuring out how to have constructive arguments rather than rigorous reports that get ignored. 3) Use multiple measures to assess performance over time. Multiple perspectives tend to produce higher-quality judgments about evidence and what to do about it; you rarely have complete information, but multiple data sources provide a clearer picture. [2-minute table-group question interlude]
  • Traditional question: How do we teach it? It's important to understand that the conversations and debates we have about these areas are informed and shaped by our operating assumptions and personal theories about the more fundamental issues of knowledge, learning, and assessment. More often than not these are tacit assumptions; if we can make them visible, we can get clearer about our understandings and their relationship to current scholarship on teaching and learning. LEARNING: "…Being able to repeat facts and plug numbers into formulae to get the right answers is handy, even essential. But it is not what education is fundamentally about… Learning should be about changing the ways in which learners understand, or experience, or conceptualize the world around them…" (Paul Ramsden, 1988, 1992)
  • We laugh at this, in part because it’s a laugh of recognition, isn’t it? And underneath the exaggeration and humor there’s more truth to this perspective than we care to admit most of the time—not to mention the fact that it reflects fairly common perceptions of our business.
  • The notion of "cognitive apprenticeship": enculturating students into specific "communities of practice." Use meaningful problems and tasks situated in specific domains; include at least some of the "messy" attributes of real-world problems, preferably linked to student interests. Learning consists of qualitative changes in how people interpret core concepts and reason about key content areas.
  • COURSE: grading/evaluation processes regarding core concepts/themes, skills, and ways of reasoning; classroom assessment efforts (Cross/Angelo). DEPT./PROGRAM: faculty consensus around shared goals and essential student learning outcomes/abilities; adequate (not minimum) performance standards. INSTITUTION: institutional effectiveness approaches; general education abilities; student tracking processes/indicators.
  • Pierce: common rubrics. Green River: community rubrics. Big Bend, Spokane, Spokane Falls, Yakima Valley, … Capstone courses (LCC). Artifact collection (Johnson County CC).
  • Student learning outcomes focus; institutional effectiveness focus. Both are driven by the critical underlying processes reflected in the Alverno definition of assessment-as-learning.
  • Cf. Peter Ewell's plenary speech to the AAHE Assessment Forum, June 2002 (http://www.aahe.org/assessment/2002/). There is no clear evidence that efforts to "make people accountable" to some outside group actually improve quality; authentic assessment is critical at all levels: classroom, program, and institution. Nancy Cochran: "We have been overzealous in our application of a useful but limited technique [statistical analysis in social sciences]…" ("An Essay on the Inappropriateness of Program Evaluation," 1980). Quantified assessments carry serious potential unintended consequences, including the fact that numbers have an air of authenticity about them even if they are inaccurate or false, and the way quantified standards can become ends in themselves and thus distort behavior in unproductive ways. Peter Johnston: we need to question the assumption "that it is possible to obtain an objective, valid unbiased empirical description of human learning activity and that it will serve educational stakeholders to do so…" ("Constructive Evaluation and the Improvement of Teaching and Learning," 1989). We can't treat the world as if it were simply sitting there, available to our assessment of it; our actions and our language affect the environment in the process of assessment, and if we want to understand our world, let alone change it, we need to attend carefully to those effects. The point is that no matter how we go about educational evaluation, it involves interpretation, and we simply need to be conscious of that and, where possible, figure out how to articulate and improve the judgment processes involved.
  • How can you encourage and support this kind of work among your peers, in your department, across the campus, …? What is the role of accreditation in terms of helping and/or hindering this work?
  • 1) Use a collaborative process with colleagues. 2) Ground your judgments in student performances or products. 3) Decide what's good enough: what meets the standards you've set for your students, and do they know what those standards are? 4) Focus on brainstorming ways to help students achieve the learning goals you have for them (the scholarship of teaching, and of assessment).
  • Outcomes Assessment Overview

    1. Assessment as a Way of Understanding & Improving Student Learning. William S. Moore, Ph.D., Policy Associate, Assessment, Teaching & Learning, WA State Board for Community & Technical Colleges. bmoore@sbctc.ctc.edu, 360-704-4346. College of the Siskiyous, January 12, 2007.
    2. As you can clearly see on slide 397… "Oh, no! Not another case of PowerPoint poisoning!" (Courtesy of "Dilbert" & Scott Adams)
    3. Overview of Workshop
       • Understanding the Context
       • Beginning with the End in Mind
       • Facilitating Student Learning
       • Making Judgments about Progress
       • Putting the Pieces Together
       • Exploring Other Questions
       • Focusing on Assignments as Assessments
    4. "Assessment is…"
    5. Accreditation: Caught between Competing Worldviews?
       • Accountability: compliance, "prove"; "them" (external); measures & answers
       • Assessment: self-reflection, "improve"; "us" (internal); evidence & better-informed judgments
       • Accreditation sits between the two, shaped by the why, who, and what questions
    6. The Promise of Assessment
       • Create a new notion of educational excellence for higher education
       • Reframe our understanding of accountability
       • Revive/strengthen a sense of campus community by focusing on significant, collective purposes
    7. Collective Sense of Purpose
       "Assessment...requires us to work together, and to do unfamiliar things like setting common goals and standards, devising methods of assessment, interpreting the results, and using them to improve and coordinate our teaching. [It thus] possesses all the appeal and efficiency of committee work, in particular the kind visited upon us by administrators." (Robert Holyer, Change, Sept./Oct. 1998)
    8. Beginning with the End in Mind
    9. Underlying Perspectives Drive Key Questions
       • Knowledge → Curriculum: What do we want students to know & be able to do?
    10. Knowledge as a Set of Tools
       • "Situated" or grounded in specific contexts in which it is used (and learned)
       • Expertise as a body of knowledge organized around "big ideas," not isolated facts
    11. Making Judgments about Progress
    12. Underlying Perspectives Drive Key Questions
       • Knowledge → Curriculum: What do we want students to know & be able to do?
       • Evidence → Assessment: How do we judge competence on key outcomes?
    13. A Sampling of Assessment Approaches
       • Research papers, essay tests
       • Self-evaluations
       • Interviews
       • Performance tasks (e.g., cases, problems, etc.)
       • Multiple-choice tests
       • Projects, field work
       • Standardized tests, surveys
       • Peer evaluations
       • Portfolio collections of work
       • External assessor ratings
       • Focus groups
    14. Core Principles of Assessment
       • Assess the things that really matter, not just the things easily assessed
       • Emphasize the quality and quantity of conversations about assessment evidence
       • Use a variety of approaches and multiple indicators
    15. Facilitating Student Learning
    16. Underlying Perspectives Drive Key Questions
       • Knowledge → Curriculum: What do we want students to know & be able to do?
       • Evidence → Assessment (grading): How do we judge competence on key outcomes?
       • Learning → Pedagogy: How do we promote learning most effectively?
    17. College Learning???
       • Basically, you learn two kinds of things in college:
       • Things you will need to know in later life (2 hours)…
       • Things you will NOT need to know in later life (1198 hours). These are the things you learn in classes whose names end in '-ology', '-osophy', '-istry', '-ics', and so on. The idea is, you memorize these things, then write them down in little exam books, then forget them. If you fail to forget them you become a professor and have to stay in college the rest of your life.
       (Dave Barry, 1981)
    18. Learning as Deep Understanding
       • Collaborative work around authentic, "situated" activities
       • "Learning to be" vs. "learning about"
       • Teacher's roles: modeling, scaffolding, fading, coaching
    19. Putting the Pieces Together
    20. Multiple Levels of Assessment
       • Course
       • Department/Program
       • Institution
    21. General Education Assessment Decisions
       • Definition: what matters?
       • Focus: individual students or programs?
       • Level of emphasis: what's a program?
       • Nature of evidence: direct or indirect?
       • Approach: external or embedded?
    22. Student Learning Outcomes & Institutional Effectiveness
       • Reflecting regularly on the strengths & weaknesses of the institution
       • Asking fundamental questions about learning and the conditions for learning
       • Observing & judging performance based on explicit criteria
       • Providing feedback based on those judgments
    23. Scary Issues Around Assessment
       • Assuming "accountability" leads to real improvement
       • Getting lost in administrivia
       • Worshipping numbers
       • Obsessing over "objectivity" & "being scientific"
    24. Every complex question has a simple answer… and it's wrong. (H.L. Mencken)
    25. A Sampling of Assessment Resources/Gatherings
       • The Research and Planning Group for California Community Colleges: http://www.rpgroup.org/
       • Academic Quality Improvement Program: http://www.aqip.org/
       • Western Assessment Conference: http://business.fullerton.edu/events/AssessmentConf/
       • International Assessment & Retention Conference: http://www.naspa.org/assessment/index.cfm
       • Partial listing of other assessment conferences: http://www.assessmentconferences.com/
       • WA Assessment, Teaching, & Learning site: http://www.sbctc.ctc.edu/college/e_assessment.aspx (annual conference, regular e-newsletter)
    26. Exploring Other Questions
    27. Focusing on Assignments as Assessments
    28. Clarifying "Good Work": Making Implicit & Private Judgments Explicit & Public
       • Review concrete examples of real work
       • Develop shared notions of quality
       • Define what constitutes adequate evidence of quality
       • Discuss learning experiences that help foster the desired performances
    29. "Good assessment tasks are interchangeable with good instructional tasks." (Lorrie Shepard, "The role of assessment in a learning culture," 2000)
    30. The best assessment tools are the minds of teachers and learners.
