Designing Outcomes-based Education Assessment Tasks

Designing Assessment Tasks for Outcomes-based Teaching and Learning Methodology

Transcript

  • 1. Designing Assessment Tasks
    by Ms. Sheryl B. Satorre
    In-Service Training 2013
    University of Cebu – Main Campus
    May 28 – 31, 2013
    (iamsbsatorre@gmail.com)
  • 2. Outline:
    1. Understanding Assessment
    2. Steps in Designing Assessment Tasks
    3. Workshop # 3 – Designing ATs
  • 3. Understanding Assessment
  • 4. What is Assessment?
    Assessment is the ongoing process of gathering, analyzing, and reflecting on evidence to make informed and consistent judgments to improve future student learning.
  • 5. In layman’s language, how is the process of assessment described?
    • Plan it!
    • Do it!
    • Check it!
    • Revise it!
    • Repeat it!
  • 6. Uses of Assessment
    Planning, Conducting, and Evaluating Instruction
    • assessment can provide information to guide instructional decisions
    • prior to instruction: planning for instruction and subsequent assessment
    • during instruction: determining effectiveness of instruction and whether reinstruction is needed
    • following instruction: determining if revisions are necessary for the next period, next class meeting, or next year
  • 7. Uses of Assessment
    Diagnosing Student Difficulties
    • assessment prior to instruction in order to determine what students know and can do
    • important in helping teachers plan for instruction
    Placing Students
    • assessment for purposes of grouping students based on ability, organizing students for group work, sequencing of coursework, etc.
  • 8. Uses of Assessment
    Providing Feedback (Formative)
    • assessment can provide feedback to students regarding their academic progress
    • important to provide this type of feedback in an ongoing manner
    Grading and Evaluating Learning (Summative)
    • formal assessments of learning following the completion of instruction
    • typically used to communicate results to students, parents, and others
  • 9. 3 Main Purposes for Assessment
  • 10.
    • Assessment for Learning (AfL) occurs when teachers use inferences about student progress to inform their teaching. (formative; embedded in the TLAs)
    • Assessment as Learning (AsL) occurs when students reflect on and monitor their progress to inform their future learning goals. (formative; embedded in the TLAs)
    • Assessment of Learning (AoL) occurs when teachers use evidence of student learning to make judgments on student achievement against goals and standards. (summative; occurs at the end of the process, task, or period)
  • 11. Formal vs. Informal Assessment
    Formal Assessment Methods
    planned in advance of their administration
    lack spontaneity
    typically occur at the end of instruction
    students are aware of these methods
    examples include chapter tests, final exams, graded homework, etc.
    Informal Assessment Methods
    more spontaneous; less obvious
    typically occur during instruction
    examples include teacher observations and questions
  • 12. Qualitative vs. Quantitative Assessment
    Quantitative Assessment Methods
    yield numerical scores
    major types include teacher-constructed tests, standardized tests, checklists, and rating scales
    Qualitative Assessment Methods
    yield verbal descriptions of characteristics
    main types include teacher observations, anecdotal records, and informal questions
  • 13. Formative vs. Summative
    Formative Evaluation
    decision making that occurs during instruction for purposes of making adjustments to instruction
    more of an evaluation of one’s own teaching rather than of students’ work
    may be based on formal or informal methods
    Summative Evaluation
    occurs at the end of instruction (e.g., end of chapter, end of unit, end of semester)
    typically used for administrative decisions (e.g., assigning grades, promoting/retaining students)
    based solely on formal assessment methods
  • 14. Standardized vs. Nonstandardized Assessment
    Standardized Assessment Methods
    administered, scored, and interpreted in identical fashion for all examinees
    purpose is to allow educators to compare students from different schools, states, etc.
    examples include SAT, GRE, ITBS, CAT, PRAXIS
    Nonstandardized Assessment Methods
    typically made by teachers for classroom use
    purpose is to determine the extent to which subject matter is being taught and learned
  • 15. Norm-Referenced vs. Criterion-Referenced Assessment
    Norm-Referenced Assessment Methods
    show where an individual student’s performance lies in relation to other students
    standardized tests are usually norm-referenced
    results are quantitative
    student performance is compared to a norm group
    Criterion-Referenced Assessment Methods
    compare student performance to pre-established criteria or objectives
    results are quantitative, qualitative, or both
    also known as mastery, objectives-referenced, or competency tests
  • 16. Traditional vs. Alternative Assessment
    Traditional Assessment Methods
    procedures such as pencil-and-paper tests and quizzes
    only one correct response to each test item
    easily and efficiently assess many students simultaneously
    encourage memorization of facts, etc.
    Alternative Assessment Methods
    more appropriate for hands-on, experiential learning
    include authentic assessment (involve real application of skills beyond the instructional context)
  • 17. Objective vs. Subjective Assessment
    Objective Assessment Methods
    "objective" refers to the method of scoring (no judgments)
    contain only one correct answer
    examples: multiple-choice, true-false, matching items
    also known as structured-response, selected-response, teacher-supplied items
    Subjective Assessment Methods
    scoring involves teachers’ subjective judgments
    several possible correct responses, or a single correct response with several ways to arrive at that answer
    examples: short-answer and essay items
    also known as open-ended, constructed-response, supply-type items
  • 18. Ethical Issues Related to Assessment
    Teacher Responsibilities in the Classroom
    • ensuring that students are properly motivated to do their best on any type of assessment method, that all types of assessment methods are administered fairly, and that results are interpreted appropriately
    Motivating Students
    • should not try to trick students on classroom assessments
    • provide encouragement
    • familiarize students with assessment procedures (i.e., develop students’ "testwiseness" skills)
  • 19. Ethical Issues Related to Assessment
    Test Administration
    • establishes a positive environment within the assessment situation
    • discourages cheating
    Interpretation of Test Results
    • tests do not result in measures of the entire person
    • interpretation should be limited to only those skills measured by a particular test
    • avoids overgeneralizations
  • 20. Characteristics of an Exemplary Assessment Task (Huba & Freed)
    • Valid: yields useful information to guide learning
    • Coherent: is structured so that activities lead to the desired performance product
    • Authentic: addresses ill-defined problems/issues that are enduring or emerging
    • Rigorous: requires use of declarative and functional knowledge
  • 21. Characteristics of an Exemplary Assessment Task (Huba & Freed)
    • Engaging: provokes student interest and persistence
    • Challenging: provokes, as well as evaluates, student learning
    • Respectful: allows students to reveal their uniqueness as learners
    • Responsive: provides feedback to students to improve their learning
  • 22. Steps in Designing an OBE-based Assessment Task
    1. Choose the right assessment task/method.
    2. Choose the right student activities to complete the assessment task/method.
    3. Create the scoring or grading criteria.
  • 23. 1. Choose the right assessment task or method.
  • 24.
    1. Is the assessment task aligned with the subject intended learning outcome?
    2. Does the assessment task reflect its relative importance to the subject intended learning outcome?
    3. Is the assessment task realistic to the student?
    4. Is the assessment task measurable?
    5. Are the resources needed to carry out the assessment task available?
  • 25. Common Verbs in the ILOs and Possible Assessment Tasks
    Describe: Assignment, Essay question exam
    Explain: Assignment, Essay question exam, Oral exam
    Integrate: Project, Assignment
    Analyse: Case Study, Assignment
    Apply: Project, Case Study, Experiment
    Solve: Case Study, Project, Experiment
    Design, Create: Project, Experiment
    Reflect: Reflective journal/diary, Portfolio, Self-assessment
    Communicate: A range of oral, writing, or listening tasks
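    The verb-to-task mapping above is essentially a lookup. The following minimal sketch (not part of the original deck; the function name and matching rule are assumptions for illustration) shows how a course designer might encode it to get candidate assessment tasks from the action verb in an ILO.

```python
# Minimal sketch: suggest assessment tasks from the ILO verb table above.
# The dictionary mirrors the slide; the helper function is hypothetical.
VERB_TO_TASKS = {
    "describe": ["Assignment", "Essay question exam"],
    "explain": ["Assignment", "Essay question exam", "Oral exam"],
    "integrate": ["Project", "Assignment"],
    "analyse": ["Case Study", "Assignment"],
    "apply": ["Project", "Case Study", "Experiment"],
    "solve": ["Case Study", "Project", "Experiment"],
    "design": ["Project", "Experiment"],
    "create": ["Project", "Experiment"],
    "reflect": ["Reflective journal/diary", "Portfolio", "Self-assessment"],
    "communicate": ["Oral, writing or listening tasks"],
}

def suggest_tasks(ilo: str) -> list[str]:
    """Return candidate assessment tasks for the first known verb in an ILO."""
    for word in ilo.lower().split():
        if word in VERB_TO_TASKS:
            return VERB_TO_TASKS[word]
    return []

print(suggest_tasks("Apply a relational database model to design a database"))
# -> ['Project', 'Case Study', 'Experiment']
```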
  • 26. Possible Assessment Methods for the Computing Field
    • Practical Work
    • Computer Simulations
    • Laboratory Work
    • Problems to Solve
    • Reflective Learning Statements
    • Self-test
    • Final Exams
    • Essays
    • Assignments
    • Field Reports
    • Article Review
    • Group Work
    • Portfolios
    • Performances & Presentations
    • Projects
    • Independent Study
    • Learning Contracts
  • 27. Specific Guide Questions (adapted from 500 Tips on Assessment, Sally Brown, Phil Race and Brenda Smith, 1996)
    • If you want a written assessment instrument, which of the following would you choose? Consider the best uses of essays, reports, reviews, summaries, dissertations, theses, annotated bibliographies, case studies, journal articles, presentations and exams.
    • Should the method be time-constrained? Exams and "in-class" activities might well be the most appropriate for the occasion. Time-constrained tests put students under pressure, but are usually fairly good at preventing cheating.
  • 28.
    • Is it important that the method you choose includes cooperative activity? If it is important, you might choose to assess students in groups, perhaps on group projects, poster displays or presentations.
  • 29.
    • Is a visual component important? When it is, you might choose portfolios, poster displays, critique sessions or exhibitions.
    • Is it important that students use information technology? When this is the case, computer-based assessments may be best, either getting students to answer multiple-choice questions, or write their own programs, or prepare databases, or write information stacks for hypertext, or material for use in CD-ROM systems or on the Internet.
  • 30.
    • Do you wish to try to assess innovation or creativity? Some assessment methods that allow students to demonstrate these include: performances, exhibitions, poster displays, presentations, projects, student-led assessed seminars, simulations and games.
    • Do you want to encourage students to develop oral skills? If so, you might choose to assess presentations, recorded elements of audio and video tapes made by students, assessed discussions or seminars, interviews or simulations.
    • Do you want to assess the ways in which students interact together? You might then assess negotiations, debates, role plays, interviews, selection panels, and case studies.
  • 31.
    • Is the assessment of learning done away from the institution important? For example, you may wish to assess learning done in the workplace, in professional contexts or on field courses. You may choose to assess logs, reflective journals, field studies, case studies or portfolios.
    • Is your aim to establish what students are able to do already? Then you could try diagnostic tests (paper-based or technology-based), profiles, records of achievement, or portfolios.
  • 32. 2. Choose the right student activities to complete the assessment task/method.
  • 33. Are the student activities to complete the assessment task aligned with the subject intended learning outcome?
    The verb in the subject intended learning outcome provides the clue on the kinds of student activities in the assessment task.
  • 34. 3. Create the scoring or grading criteria.
  • 35. Methods of Grading SILOs
    1. Direct Grading
    2. Indirect Grading
  • 36. Direct Grading
    Grading the overall SILOs → Grading Criteria (using Rubrics) of Individual SILO → Derive Final Grade
  • 37. Example: DBSys31 SILOs
    • SILO 1 – contrast traditional file-based systems and database systems in terms of efficiency on data manipulation, information access and security
    • SILO 2 – explain the different data models as basis for designing an information system
    • SILO 3 – apply a relational database model to design the database for a particular information system
    • SILO 4 – design a normalized database for the intended information system
    • SILO 5 – construct the appropriate SQL statements to solve SQL query problems
  • 38. UC Grading System
    Grade       Equivalent
    1.0         100% - 95%
    1.1 – 1.5   94% - 90%
    1.6 – 2.5   89% - 80%
    2.6 – 3.0   79% - 75%
    5.0         74% - 65%
    NC          No Credit
    NG          No Grade
    DR          Dropped
    W           Withdrawn
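    For readers who want the table in executable form, the percentage-to-grade mapping could be written as a small function. This is a minimal sketch, not part of the deck; the function name is hypothetical, and scores below 65% are assumed to fall into 5.0.

```python
def uc_grade_equivalent(percent: float) -> str:
    """Map a percentage score to its UC grade equivalent (per the table above).

    Hypothetical helper; band boundaries follow the slide, and scores below
    65% are assumed to map to the failing grade 5.0.
    """
    if percent >= 95:
        return "1.0"
    if percent >= 90:
        return "1.1 - 1.5"
    if percent >= 80:
        return "1.6 - 2.5"
    if percent >= 75:
        return "2.6 - 3.0"
    return "5.0"

print(uc_grade_equivalent(92))  # -> "1.1 - 1.5"
```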
  • 39.
    • SILO 1 – contrast traditional file-based systems and database systems in terms of efficiency on data manipulation, information access and security
    • SILO 2 – explain the different data models as basis for designing an information system
    • SILO 3 – apply a relational database model to design the database for a particular information system
    Grade equivalent: 2.6 – 3.0 (79% - 75%)
  • 40.
    • SILO 1 – contrast traditional file-based systems and database systems in terms of efficiency on data manipulation, information access and security
    • SILO 2 – explain the different data models as basis for designing an information system
    • SILO 3 – apply a relational database model to design the database for a particular information system
    • SILO 4 – design a normalized database for the intended information system
    Grade equivalent: 1.6 – 2.5 (89% - 80%)
  • 41.
    • SILO 1 – contrast traditional file-based systems and database systems in terms of efficiency on data manipulation, information access and security
    • SILO 2 – explain the different data models as basis for designing an information system
    • SILO 3 – apply a relational database model to design the database for a particular information system
    • SILO 4 – design a normalized database for the intended information system
    • SILO 5 – construct the appropriate SQL statements to solve SQL query problems
    Grade equivalent: 1.1 – 1.5 (94% - 90%) or 1.0 (100% - 95%)
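    Slides 39–41 suggest that, under direct grading, the grade band follows from how far along the SILO sequence a student has demonstrated achievement. The sketch below encodes that reading; it is an assumption drawn from those slides, and the helper name and the default band for fewer than three SILOs are hypothetical.

```python
# Minimal sketch of direct grading as slides 39-41 illustrate it:
# the number of SILOs demonstrated determines the grade band.
# The mapping and the default band are assumptions, not official policy.
SILO_BANDS = {
    3: "2.6 - 3.0 (79% - 75%)",
    4: "1.6 - 2.5 (89% - 80%)",
    5: "1.0 - 1.5 (100% - 90%)",
}

def direct_grade(silos_demonstrated: int) -> str:
    """Return the grade band for the number of SILOs demonstrated (hypothetical helper)."""
    band = "5.0 (74% - 65%)"  # assumed default when fewer than 3 SILOs are met
    for threshold, label in sorted(SILO_BANDS.items()):
        if silos_demonstrated >= threshold:
            band = label
    return band

print(direct_grade(4))  # -> "1.6 - 2.5 (89% - 80%)"
```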
  • 42. Indirect Grading
    Grading the Assessment Tasks which are aligned with the SILOs → Grading Criteria (using Rubrics) of individual assessment task → Derive Final Grade
  • 43. Using Rubrics
    • A rubric is a scoring tool that lays out the specific expectations for a performance task.
    • Rubrics divide a performance task into its component parts and provide a detailed description of what constitutes acceptable and unacceptable levels of performance for each of those parts.
    • Rubrics can be used for grading a large variety of tasks: discussion participation, laboratory reports, portfolios, group work, oral presentations, role plays and more (Stevens and Levi, 2005).
  • 44. 2 Vital Components of a Rubric
    1. Criteria
    2. Scale – describes how well or poorly any given task has been performed (e.g., Very Good, Good, Fair, Needs Improvement)
  • 45. Rubric Title:
    Assessment Task:
    SILO:
                    Scale Level 1   Scale Level 2   Scale Level 3   Score
    Criterion 1
    Criterion 2
    Criterion 3
    Criterion 4
    Feedback:
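    A rubric template like the one above can also be represented as a data structure so that task scores are tallied consistently. The following is a minimal sketch under stated assumptions: the rubric title, criteria names, the 3-point scale, and the simple summing rule are illustrative only and are not prescribed by the deck.

```python
# Minimal sketch of scoring one assessment task with a rubric.
# Criteria names, the 3-point scale, and the summing rule are assumptions;
# a real rubric would carry full descriptions for each scale level.
rubric = {
    "title": "Database Design Rubric",               # hypothetical title
    "assessment_task": "Design a normalized database",
    "silo": "SILO 4",
    "criteria": ["Criterion 1", "Criterion 2", "Criterion 3", "Criterion 4"],
    "scale_levels": {1: "Needs Improvement", 2: "Good", 3: "Very Good"},
}

def score_task(ratings: dict[str, int]) -> tuple[int, int]:
    """Sum the scale level chosen for each criterion; return (score, maximum)."""
    maximum = len(rubric["criteria"]) * max(rubric["scale_levels"])
    score = sum(ratings[c] for c in rubric["criteria"])
    return score, maximum

ratings = {"Criterion 1": 3, "Criterion 2": 2, "Criterion 3": 3, "Criterion 4": 2}
score, maximum = score_task(ratings)
print(f"{score}/{maximum}")  # -> 10/12
```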
  • 46. Workshop # 3 – Designing Assessment Tasks (ATs)
    1. Design Assessment Tasks for your CILO # 1.
    2. Present your design in the form of the table below.
    CILO | Assessment Tasks | Student Activities in completing the ATs
  • 47. References:
    • http://www.aaia.org.uk/pdf/Publications/AAIA%20Pupils%20Learning%20from%20Teachers%20Responses.pdf
    • http://www.aaia.org.uk/pdf/Publications/AAIAformat4.pdf
    • http://www.aaia.org.uk/pdf/asst_learning_practice.pdf
    • http://community.tes.co.uk/forums/t/300200.aspx
    • http://www.schoolhistory.co.uk/forum/lofiversion/index.php/t7669.html
    • www.harford.edu/irc/assessment/FormativeAssessmentActivities.doc
    • Paul Black et al., Assessment for Learning (Open University Press, Maidenhead, 2003)
    • Paul Black et al., "Working inside the black box" (nferNelson, London, 2002)
    • Paul Black and Dylan Wiliam, Inside the Black Box (nferNelson, London, 1998)
    • Assessment Reform Group, Testing, Motivation and Learning (The Assessment Reform Group, Cambridge, 2002)
    • Assessment Reform Group, Assessment for Learning (The Assessment Reform Group, Cambridge, 1999)
    • Angelo, T.A., & Cross, K.P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers. San Francisco: Jossey-Bass.
  • 48.
    • Southern Illinois University. Several CATs online: http://www.siue.edu/~deder/assess/catmain.html
    • Bresciani, M.J. (September, 2002). The relationship between outcomes, measurement, and decisions for continuous improvement. National Association for Student Personnel Administrators, Inc. NetResults E-Zine. http://www.naspa.org/netresults/index.cfm
    • Bresciani, M.J., Zelna, C.L., and Anderson, J.A. (2004). Techniques for Assessing Student Learning and Development in Academic and Student Support Services. Washington, D.C.: NASPA.
    • Ewell, P.T. (2003). Specific Roles of Assessment within this Larger Vision. Presentation given at the Assessment Institute at IUPUI. Indiana University-Purdue University-Indianapolis.
    • Maki, P. (2001). Program review assessment. Presentation to the Committee on Undergraduate Academic Review at NC State University.
    • Bresciani, M.J. (2006). Outcomes-Based Undergraduate Academic Program Review: A Compilation of Institutional Good Practices. Sterling, VA: Stylus Publishing.
    • Bresciani, M.J., Gardner, M.M., & Hickmott, J. (In Press). Demonstrating student success in student affairs. Sterling, VA: Stylus Publishing.
    • NC State University, Undergraduate Academic Program Review. (2001). Common Language for Assessment. Taken from the World Wide Web September 13, 2003: http://www.ncsu.edu/provost/academic_programs/uapr/process/language.html
    • Palomba, C.A. and Banta, T.W. (1999). Assessment essentials: Planning, implementing and improving assessment in Higher Education. San Francisco: Jossey-Bass.
    • University of Victoria, Counseling Services. (2003). Learning Skills Program: Bloom's Taxonomy. Taken from the World Wide Web September 13, 2003: http://www.Coun.uvic.ca/learn/program/hndouts/bloom.html
  • 49.
    • Anderson, L., & Krathwohl, D. (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
    • Biggs, J. (2003). Teaching for quality learning at university (2nd ed.). Buckingham: Open University Press/Society for Research into Higher Education.
    • Bloom, B. (1956). Taxonomy of educational objectives: The classification of educational goals. In B.S. Bloom (Ed.), Susan Fauer Company, Inc., pp. 201-207.
    • Jackson, N., Wisdom, J. and Shaw, M. (2003). Using learning outcomes to design a course and assess learning. The Generic Centre: Guide for Busy Academics. York: Higher Education Academy. Available at http://www.heacademy.ac.uk/assets/York/documents/resources/resourcedatabase/id252_Guide_for_Busy_%20Academics_Using_Learning_Outcomes_to_Design.rtf (accessed 6 September 2008).
    • Krauss, K.L. (2005). Engaged, inert or otherwise occupied: Understanding and promoting student engagement in university learning communities. Paper presented at the Sharing Scholarship in Learning and Teaching: Engaging Students, James Cook University. Available at http://www.cshe.unimelb.edu.au/pdfs/Stud_eng.pdf (accessed 6 September 2008).
    • Ramsden, P. (2003). Learning to Teach in Higher Education. London, UK: Kogan Page.
    • St Edward's University, Centre for Teaching Excellence (2004). Task-Oriented Question Construction Wheel, based on Bloom's taxonomy. Available at http://www.stedwards.edu/cte/files/BloomPolygon.pdf (accessed 6 September 2008). © 2004 St Edward's University Centre for Teaching Excellence.
  • 50. Thank you