1. Ensuring the Alignment of Assessment with Learning Outcomes: Working Smarter, not Harder. Dr. Michele Pinnock
2. From which angle are we viewing the outcome?• Are your learning outcomes attainable?• What are we valuing: process or product?• Are you compromising the validity and reliability of the assessment?• Are your expert knowledge, your prior learning experience (the "when I was in college" syndrome), and your knowledge of the standards and expectations of the real world all clouding your role as assessor?
3. CONCERNS• Students complain that they are being over-assessed• Students complain about not receiving meaningful feedback• Teachers complain about the "marking"; we continue to over-assess and overwork ourselves
4. Valuing vs Crediting• What are you valuing and what is being credited by your course?• Are we valuing higher-order thinking?• We say we value completing assignments on time, but do we credit such?
5. How valid are our assessments?• Are we testing what we say we would test?• Is our test mirroring the emphasis of our instruction?• Are we asking enough questions to make informed judgments about student performance?• Are we able to accurately predict the future performance of our students? What are we prepared to do about this situation?
6. The Message from the Assessment• Assessment sends clear messages to students regarding what is valued in a course.• Students who perceive that the assessment will test memorization are more likely to adopt a surface approach to learning (Scouller, 1998)• Ensure that the message from assessment is in accordance with what is valued by the lecturer and the course.
7. Owning Intended Learning Outcomes• We must clearly communicate the intended learning outcomes to our students so they can own them and take responsibility for achieving them• Students are rarely concerned about learning outcomes; they are interested in what is being valued by the assessment.
8. • Assessment defines what students regard as important, how they spend their time, and how they come to see themselves as students and then as graduates.• Students take their cues from what is assessed rather than from what lecturers assert is important (George Brown et al.)
9. How do we get students to learn what we intend for them to learn?• Constructive Alignment is the answer.
10. Constructive Alignment• This is a conscious effort to provide the learner with a clearly specified goal, a well-designed learning activity or activities appropriate for the task, and well-designed assessment criteria for giving feedback to the learner.
11. Aligning learning outcomes, learning and teaching activities, and the assessment. Adapted from Biggs (1999), p. 27
12. Constructive Alignment• Encourages clarity in the design of the curriculum• Ensures that both students and teachers are cognizant of what is being valued and the weighting ascribed to it• Offers transparency in the links between learning and assessment• Encourages institutions to strive towards quality assurance and enhancement.
13. Think about Assessment at the Beginning of the Learning Process. According to Biggs (1995), in aligned courses,• objectives are usually clear, functioning at the higher-order level;• teaching methods usually elicit from students those learning activities that are likely to achieve those objectives; and• our assessment confirms that students are in fact learning what our objectives say they should learn (p. 11)
14. Unaligned Course• Teacher intentions: e.g. explain, relate, prove, apply• Student activity: "dealing with the test", e.g. memorize, describe• Assessment: the exam
16. Assessment influences Learning• Assessment methods and requirements probably have a greater influence on how and what students learn than any other single factor (Boud, 1995).• How we go about assessing students can make a significant impact on how well they achieve in their studies• Poor assessment design will lead students to adopt behaviours that are counter-productive to learning
17. Steps Towards Constructive Alignment. Consider assessment at the beginning of the course• Teachers need to be clear on attainment targets for their learners• Understand and explore the assessment possibilities for measuring these targets, then select the most relevant• Understand the process that will cause students to attain these targets
18. Designing Intended Learning Outcomes (ILO)• Bloom's Taxonomy• The Structure of Observed Learning Outcomes (SOLO) Taxonomy helps to map levels of understanding that can be built into the intended learning outcomes and to create the assessment rubrics.
19. Valuing Competences [Competence := knowledge + the capacity to act upon it]• Have the student do something, then measure the product and/or the process• Objective: to learn to analyze systems for..., explain cause/effects..., prove properties of..., compare methods of...• SOLO = Structure of the Observed Learning Outcome
20. SOLO
SOLO 5 "extended abstract": to generalize, to hypothesize, to theorize
SOLO 4 "relational": to relate, to compare, to analyze
SOLO 3 "multi-structural": to classify, to combine, to enumerate
SOLO 2 "uni-structural": to identify, to do a procedure, to recite
SOLO 1 "pre-structural": no understanding; irrelevant information; misses the point
(SOLO 4-5 are qualitative levels, marking depth; SOLO 2-3 are quantitative levels, marking surface learning)
21. Graphic Illustration [diagram: student responses (R) mapped against SOLO levels 2-5, from uni-structural (to define, to identify, to recite) up to extended abstract (to theorize, to generalize, to hypothesize, to predict). Legend: filled dot = immediately relevant aspects (given); open dot = related or hypothetical (not given); x = irrelevant or inappropriate; R = student response]
22. Designing Student Activities• It is imperative that congruence exists between the intended learning outcomes, the learning activities, and the assessment task• If we value levels 2 & 3 of SOLO, surface understanding will be attained• Valuing levels 4 & 5 will result in deep understanding
23. SOLO (verbs)
SOLO 5 "extended abstract": to theorize, to hypothesize, to generalize, to predict, to apply theory (to distant problems), to put into perspective, to reflect, to criticize, to judge, to discuss
SOLO 4 "relational": to analyze, to argue, to relate, to compare, to integrate, to apply theory (to near problems), to reason about (reach a conclusion), to explain (cause-effect; similarities-differences; strengths-weaknesses)
SOLO 2+3 "multi-structural" & "uni-structural": to describe, to enumerate, to collate, to combine, to classify, to perform an algorithm, to structure, to paraphrase, to do a simple procedure, to define, to identify/name, to recite
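To make the surface/deep distinction on the previous slides concrete, the SOLO verb lists can be sketched as a small lookup table. This is an illustrative sketch only: the verb-to-level mapping is abridged from the lists above, and the function and variable names are invented.

```python
# Illustrative sketch: an abridged mapping of SOLO verbs (from the lists
# above) to their levels, used to label an intended learning outcome as
# targeting deep (levels 4-5) or surface (levels 2-3) learning.
SOLO_LEVEL = {
    "identify": 2, "recite": 2, "define": 2,                         # uni-structural
    "describe": 3, "enumerate": 3, "classify": 3, "combine": 3,      # multi-structural
    "analyze": 4, "compare": 4, "relate": 4, "explain": 4,           # relational
    "theorize": 5, "hypothesize": 5, "generalize": 5, "predict": 5,  # extended abstract
}

def depth(verb: str) -> str:
    """Label an outcome verb as targeting deep or surface understanding."""
    level = SOLO_LEVEL.get(verb.lower())
    if level is None:
        return "unknown verb"
    return "deep" if level >= 4 else "surface"

depth("compare")  # 'deep'
depth("recite")   # 'surface'
```

A check like this could flag a course whose outcomes name only surface verbs before any assessment is written.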
24. Technical Quality of Assessment Items• Cognitive complexity: learners engaged in an ample range of intellectual activity• Content quality: questions should permit learners to demonstrate their understanding of matters deemed important by experts• Accurately reflect the emphasis placed on important aspects of instruction• Transfer and generalizability• Language appropriateness• Fairness• Reliability
25. Copied from http://www.cmu.edu/teaching/assessment/basics/alignment.html
TYPE OF LEARNING OBJECTIVE: Recall, Recognize, Identify. EXAMPLES OF APPROPRIATE ASSESSMENTS: Objective test items such as fill-in-the-blank, matching, labeling, or multiple-choice questions that require students to: recall or recognize terms, facts, and concepts.
TYPE: Interpret, Exemplify, Classify, Summarize, Infer, Compare, Explain. ASSESSMENTS: Activities such as papers, exams, problem sets, class discussions, or concept maps that require students to: summarize readings, films, or speeches; compare and contrast two or more theories, events, or processes; classify or categorize cases, elements, or events using established criteria; paraphrase documents or speeches; find or identify examples or illustrations of a concept or principle.
TYPE: Apply, Execute, Implement. ASSESSMENTS: Activities such as problem sets, performances, labs, prototyping, or simulations that require students to: use procedures to solve or complete familiar or unfamiliar tasks; determine which procedure(s) are most appropriate for a given task.
26. TYPE: Analyze, Differentiate, Organize, Attribute. ASSESSMENTS: Activities such as case studies, critiques, labs, papers, projects, debates, or concept maps that require students to: discriminate or select relevant and irrelevant parts; determine how elements function together; determine bias, values, or underlying intent in presented material.
TYPE: Evaluate, Check, Critique, Assess. ASSESSMENTS: Activities such as journals, diaries, critiques, problem sets, product reviews, or studies that require students to: test, monitor, judge, or critique readings, performances, or products against established criteria or standards.
TYPE: Create, Generate, Plan, Produce, Design. ASSESSMENTS: Activities such as research projects, musical compositions, performances, essays, business plans, website designs, or set designs that require students to: make, build, design, or generate something new.
This table does not list all possible examples of appropriate assessments.
27. Authentic Assessment• Authentic assessments require students to be effective performers with acquired knowledge.• Traditional tests tend to reveal only whether the student can recognize, recall, or "plug in" what was learned, often out of context.• Authentic assessments present the student with the full array of tasks that mirror the priorities and challenges found in reality.
28. Assessing Learning in the Information Era• In this information age, students need to learn not just how to access information, but how to evaluate and use appropriate information to solve real problems/issues• Let's get them to make value judgments about the validity and reliability of the information• Remember that student learning does not depend only on what we teach• Spend more time and resources on assessment
29. "The Learning Pyramid" Average passive retention rate Lecture 5% Reading 10% nt ga of l me en leve Audiovisual 20% ge Demonstration 30% Discussion group 50% Practice by doing 75% activestudent Teaching others 80% [Kilde: NTL Institute for Applied Behavioral Science, Bethel, Maine]
30. Biggs (2003) Concept Map showing relationships within the Curriculum Design Process.
31. Concerns• Incidental learning outcomes need to be identified and, if of value, incorporated as intended learning outcomes for the next course offering• Constructive alignment cannot be achieved or maintained in an institutional system that does not allow frequent review and modification by teachers
32. Putting it all Together. There are four stages in a constructively aligned curriculum (Biggs & Tang, 2007, pp. 54-55).1. Describe the intended learning outcomes in the form of a verb (learning activity), its object (the content), and specify the context and a standard the students are to attain.2. Create a learning environment using teaching/learning activities that address that verb and therefore are likely to bring about the intended outcomes.3. Use assessment tasks that also contain that verb, thus enabling you to judge, with the help of rubrics, if and how well students' performances meet the criteria.4. Transform these judgments into standard grading criteria.
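The four stages above lend themselves to a simple mechanical check: do the assessment tasks exercise the verbs named in the intended learning outcomes? A minimal sketch in Python follows; the ILO verbs, task names, and task verbs are an entirely invented example course, not taken from the presentation.

```python
# Toy constructive-alignment audit: compare the verbs in the intended
# learning outcomes (ILOs) against the verbs each assessment task exercises.
ILO_VERBS = {"explain", "compare", "apply", "analyze"}   # invented course

ASSESSMENT_TASKS = {                                     # invented tasks
    "mid-term essay": {"explain", "compare"},
    "lab project":    {"apply"},
    "final exam":     {"recall", "explain"},
}

def alignment_report(ilo_verbs, tasks):
    """Return the gaps on both sides of the ILO/assessment alignment."""
    assessed = set().union(*tasks.values())
    return {
        "intended but never assessed": sorted(ilo_verbs - assessed),
        "assessed but never intended": sorted(assessed - ilo_verbs),
    }

alignment_report(ILO_VERBS, ASSESSMENT_TASKS)
# {'intended but never assessed': ['analyze'],
#  'assessed but never intended': ['recall']}
```

Here the audit would flag that "analyze" is promised but never assessed, while the exam rewards "recall", a verb the outcomes never valued: exactly the unaligned-course pattern from the earlier slide.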
33. Examine carefully the goals/rationale of your course. Operationalize the goals into intended learning outcomes (ILOs). Then choose carefully the forms of teaching that focus on these outcomes, and choose carefully the assessment that measures these outcomes.
34. Do we possess a vision of Evaluation? Unless the purpose is perceived to be significant, the procedures are clearly understood, and the results are perceived to be useful and relevant, the individuals whose performances are being assessed will not do their best and will not facilitate the assessment process. Learners can resist even high-stakes tests if they are deemed meaningless and unmanageable.
35. MEANINGFUL ASSESSMENT• Achieves specific goals or purposes that are significant to all, especially students• Has clear procedures, criteria, and rubrics that are understood by ALL stakeholders• Produces results that provide clear directions for improved learning and instruction
36. Manageable Assessment. Describes assessment that provides useful information on performance given the time and other resources available.• Planning and organizing the assessment• Collecting/analyzing assessment data• Recording and communicating the results of the assessment to stakeholders
37. Aligning Coursework• Is coursework for grades, or part of the learning process?• Are we engaged in formative assessment?
38. Proportions of Formative Assessment in Classrooms: a continuum of class activities, from No Formative Assessment, through Token and Moderate Formative Assessment, to Near-Total Formative Assessment.
39. Formative Assessment• Ongoing• Aids learning – Teachers feed information back to students in ways that enable the student to learn better, or – Students can engage in a similar, self-reflective process.• Helps students understand the rules of the game• Helps in the continuous monitoring of the quality of the work as both students and teacher strive towards attaining desired learning outcomes
40. Formative Assessment• Aids instruction: using evidence of students' mastery status to make adjustments to instruction if the evidence suggests these adjustments are warranted
41. Formative assessment makes teachers teach better and learners learn better. Formative assessment represents evidence-based instructional decision making.
42. Steps for Establishing Formative Assessment That Supports a Classroom Climate Shift1. Distribute classroom climate guidelines2. Seek trust constantly and nurture it seriously3. Model and reinforce appropriate conduct4. Solicit students' advice on classroom climate5. Assess students' relevant affective status
43. Concerns/Dissonance• Though formative assessment elicits deeper understanding among learners, it will not necessarily improve students' scores on standardized examinations
44. How well do teachers manage assessment?• Many assessment procedures are labour-intensive and time-consuming on the part of teachers• Teachers spend 15-20 minutes outside of school grading essay assignments (Swain & Swain, 1999)
45. VALIDITY. The degree to which a test measures what it is supposed to measure. Influenced by:• The number of items• The types of items• The time allotted for the completion of the assessment• The quality of task instructions• The weighting of each assessment in relation to the course objectives and the course in its entirety
46. RELIABILITY. This speaks to the consistency of the assessment in determining student performance. If an assessment is not reliable, then decisions made based on it will lead to problems.
47. Issues associated with some frequently used test formats. GROUP ASSESSMENT• Deciding on group members• Assessing the input of each member: process vs. product• Assessing whether all members possess the knowledge/skill/attitude being examined
48. Formative assessment aids learning by generating feedback information that is of benefit to students and to teachers. Feedback on performance, in class or on assignments, enables students• to restructure their understanding/skills and• to build more powerful ideas and capabilities.
49. Conditions necessary for students to benefit from feedback. According to Sadler (1989), the student must:a) possess a concept of the goal/standard or reference level being aimed for;b) compare the actual (or current) level of performance with that goal or standard; andc) engage in appropriate action which leads to some closure of the gap.
50. Issues with Feedback• Students not clear on learning outcomes• Students not sure what standards/expectations look like• Students not sure what is actually necessary to help them close the gap (example of an unhelpful comment: "this essay is not sufficiently analytical")
51. Good feedback practice:1. Facilitates the development of self-assessment (reflection) in learning.2. Encourages teacher and peer dialogue around learning.3. Helps clarify what good performance is (goals, criteria, expected standards).4. Provides opportunities to close the gap between current and desired performance.5. Delivers high-quality information to students about their learning.6. Encourages positive motivational beliefs and self-esteem.7. Provides information to teachers that can be used to help shape the teaching.
52. FEEDBACK, now Feed-forward• Feedback is the most significant factor in student progress• For feedback to be effective, students must be able to apply the comments they receive to improve their chances of success with the following assessment• A high level of feedback results in high-quality student outcomes, and so expectations are achieved
53. Feedback• Comparing students is not of importance because our objectives/criteria are written in terms of individual mastery of a course• Forms of feedback: checklists, rubrics, codes, regular conferencing
54. Feedback• How can we get students to read the feedback and do something with it: react to it, respond to it, use it as feed-forward, something that will make the next assignment better?• We need to design ways of giving feedback much faster, perhaps by emailing
55. ASSESSMENT INSTRUMENTS: Checklists, Rating Scales, Rubrics• Tools used to measure performance on assessments.
56. RUBRICS. A rubric is a scoring scale consisting of a set of criteria that describe the expectations being assessed/evaluated, together with descriptions of levels of quality, used to evaluate students' work or to guide students to desired performance levels.
57. Rubrics• Communicate expectations and aid instruction• Indicate the quality and quantity of student learning• Should be given with the assessment task• Allow students to assess their own work
58. Sample of a Maths Rubric
Understanding of concepts: 4 = Demonstrates a thorough understanding of the main concepts. 3 = Demonstrates an understanding of the main concepts. 2 = Demonstrates a partial understanding of the main concepts. 1 = Demonstrates little understanding of the main concepts.
Algorithms: 4 = Very capably and independently selects the most efficient algorithm, and solutions are accurate. 3 = Performs algorithms and is usually accurate. 2 = With some assistance, performs algorithms with some accuracy. 1 = Uses and performs simple algorithms with some accuracy; assistance is usually required.
Problem-solving steps: 4 = Independently applies the steps required with a very high degree of accuracy. 3 = Applies the steps of a problem and is usually accurate. 2 = Some effectiveness is evident when following and applying the steps of a problem. 1 = Often forgets the steps in a problem; some accuracy noted some of the time.
Analysis: 4 = Thorough analysis of the problem with accurate solutions. 3 = Analysis of the problem is evident, with considerable accuracy. 2 = Analyzes the problem with some success; accuracy needs to improve. 1 = Very little evidence of analysis; some educated guesses; accuracy is weak.
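A rubric like the one above is naturally a small data structure: criteria mapped to level descriptors, with a score derived from the levels awarded. The sketch below paraphrases the sample maths rubric; the abbreviated descriptors and the scoring convention (sum of levels, out of 4 per criterion) are assumptions for illustration, not part of the original slide.

```python
# Sketch: the sample maths rubric as a data structure (descriptors abridged).
MATH_RUBRIC = {
    "main concepts": {4: "thorough understanding", 3: "understanding",
                      2: "partial understanding",  1: "little understanding"},
    "algorithms":    {4: "selects most efficient; accurate",
                      3: "usually accurate",
                      2: "some accuracy, with assistance",
                      1: "simple algorithms only; needs assistance"},
    "problem steps": {4: "independent, very high accuracy",
                      3: "usually accurate",
                      2: "some effectiveness",
                      1: "often forgets the steps"},
    "analysis":      {4: "thorough, accurate solutions",
                      3: "evident, considerable accuracy",
                      2: "some success; accuracy must improve",
                      1: "very little evidence"},
}

def rubric_score(ratings, rubric=MATH_RUBRIC, top_level=4):
    """Total the awarded levels and report a percentage of the maximum."""
    missing = set(rubric) - set(ratings)
    if missing:
        raise ValueError(f"unrated criteria: {sorted(missing)}")
    earned = sum(ratings[c] for c in rubric)
    return earned, round(100 * earned / (len(rubric) * top_level))

rubric_score({"main concepts": 4, "algorithms": 3,
              "problem steps": 3, "analysis": 2})   # (12, 75)
```

Encoding the rubric once this way also makes it easy to hand students the same criteria and descriptors that will be used to grade them, which supports the self-assessment point above.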
59. Assessment Instruments. List of assessment tools: http://course1.winona.edu/shatfield/air/rubrics.htm ; http://academic.scranton.edu/department/assessment/ksom/ ; http://flightline.highline.edu/socc/ToolsResources/Tools/samplerubrics.htm . Written Papers: http://www.fordham.edu/halsall/med/rubric.html
61. Activity. Using examples of assessment strategies, differentiate between:• Traditional vs Alternative Assessment• Paper & Pencil vs Performance Assessment• Objective vs Subjective• Process vs Product• Authentic Assessment
62. Selecting Assessment Strategies• Ensure that the strategy mirrors the objectives• Are competencies and skills being adequately measured?• At the tertiary level we rely heavily on three main assessment strategies: class presentations, essays, and examinations. Recently, group work has been added to the list
63. Alternative Assessment• Portfolios, in which a collection of meaningful tasks geared towards enhancing learning is included. Please note that for portfolios, assessment is the secondary purpose• Simulations• Work experience• Projects: product? process? or both product and process?• Debates• Displays
64. • Concern: external examiners need to validate grades and thus need proof of assessment. Answer:• Design your assessment framework prior to the course; videotape; and even invite them to play a passive role while you assess
65. Designing Assessments that Capture Student Interest and Promote Learning• Using the Document Camera• Discussion Board• Weblogs• Chatrooms• Making movies/digital stories• Podcasts
66. Consider all the factors prior to…• Emailing of assignments: implications such as who will print, format, etc.• Digital form of assignment: copy and paste, etc.• Availability of resources/materials• Guarding against plagiarism
67. Enhancing Reliability• Frequency and duration of assignments• More assessment tasks lead to an increase in reliability: greater consistency• More, and thus shorter, assessment items mean students get more guidance on their performance, which often leads to less anxiety. HOWEVER, in the real world students will have to face problems that are multifaceted, and so the assessment would not accurately measure their ability to solve such problems• Sometimes, in a bid to ensure reliability, validity is sacrificed• Using more than one scorer increases reliability• Define the marking scheme
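One common way to quantify the internal-consistency reliability discussed above is Cronbach's alpha, which (other things being equal) rises as consistent items are added, in line with the "more tasks, greater consistency" point. A minimal sketch using only the Python standard library; the sample scores are invented:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a students x items matrix of scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals),
    where k is the number of items (population variances throughout).
    """
    k = len(scores[0])                                    # number of items
    item_vars = sum(pvariance(col) for col in zip(*scores))
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented scores: four students, three items that rank the students
# similarly, so internal consistency (and hence alpha) is high.
scores = [[4, 5, 4], [2, 3, 2], [5, 5, 5], [1, 2, 1]]
print(round(cronbach_alpha(scores), 2))   # 0.99
```

An alpha near 1 only says the items agree with each other; as the slide warns, a set of very consistent items can still sacrifice validity if none of them measures the multifaceted problems we actually care about.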
68. Good Assessment Principles• Use a range of assessment tasks to ensure balance of coverage and depth• Validity is achieved when assessment items measure the kind of knowledge desired (Relevance)• The emphasis of instruction is in tandem with the emphasis of assessment (Balance)• Assessment tasks are pitched at the levels outlined in the learning outcomes
69. Evaluation as a Tool for Empowering our Students• Ensure that all students have an equal opportunity to achieve to the best of their ability• Consider the learner: test anxiety; workload• Assessment of, as, and for Learning
70. Integrating Assessment• Look at the assessment• How do assessment pieces relate to other pieces over the semester?• Are students being asked to do a similar piece of work in another course? Consider integration within disciplines (departments) and with other disciplines (departments)• Look at the weightings• Who will be marking: set your schedule• What type of feedback will be given, and when?
73. Moving Forward…• Constructive alignment cannot be achieved or maintained if we are not actively engaged in systematic and frequent reviews and modification of course and programme offerings• We need to ensure that we use a modest number of extraordinarily important curricular aims• We must be committed to analyzing and compiling reports for each outcome/course
74. Conclusion• Let's stop marking• Let's move towards ensuring that there is a seamless transition between assessment and learning• Ensure that assessment is meaningfully integrated into the learning process• Let's ensure that opportunities for meaningful and timely feedback are intertwined with instruction
75. Let’s Work Smarter not Harder• I thank you
76. References• Biggs, J. (1999). Teaching for Quality Learning at University. Buckingham: SRHE and Open University Press.• Biggs, J. (2003). Aligning Teaching and Assessment to Curriculum Objectives. Imaginative Curriculum Project, LTSN Generic Centre.• Biggs, J. and Tang, C. (2007). Teaching for Quality Learning at University. Maidenhead: McGraw-Hill and Open University Press.• Sadler, D. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119-144.
77. Aligning Intended Learning Outcomes, Teaching, and Assessment Tasks