AAGLO Forum May 2012v2
    AAGLO Forum May 2012v2: Presentation Transcript

    • Assessing and Assuring Graduate Learning Outcomes
      National Fora: Brisbane, Melbourne, Sydney, Adelaide, Perth
      Audio link to this presentation
    • AAGLO National Fora 2012: WELCOME
      Forum program:
      ● 9:15 Registration
      ● 9:30 Opening and brief overview of the AAGLO Project
      ● 9:45 Keynote address: Professor Trudy W. Banta
      ● 10:45 Questions
      ● 11:10 Morning tea
      ● 11:30 Presentation of AAGLO interview findings
      ● 12:00 Workshop: response to issues raised (15 minutes, Trudy Banta)
      ● 1:30pm - 2:15pm Lunch
    • Forum Objectives
      For participants to engage with colleagues in:
      ● discussion of practice and issues in the assessment and assurance of graduate learning outcomes in the Australian higher education context
      ● developing informed opinion to contribute to institutional decision-making at various levels
      ● forming collaborations for further investigation and innovation in this area.
    • The AAGLO Project
      Funded in 2010 under the ALTC Strategic Priority Project Scheme to investigate:
      ● the types of assessment tasks most likely to provide convincing evidence of student achievement of, or progress towards, graduate learning outcomes, and
      ● the processes that best assure the quality of assessment of graduate learning outcomes.
    • Project team:
      ● Simon Barrie (The University of Sydney)
      ● Clair Hughes (The University of Queensland)
      ● Geoffrey Crisp (RMIT)
      ● Anne Bennison, Project Manager (The University of Queensland)
      Timeline: Jan 2011 - August 2012
      International reference group
      Broad in scope and range of activities
      Project website: http://www.itl.usyd.edu.au/projects/aaglo/
    • Project activities and outcomes to date
      Activities:
      ● Situational analysis
      ● Literature review
      ● Consultation with reference group
      ● Visits to international centres of excellence
      ● Conference roundtables
      ● Participation in national debates
      ● Interviews
      Outcomes:
      ● "Related projects" identified and documented; communication with project and institutional leaders
      ● Summary papers:
        1. The ALTC AAGLO project and the international standards agenda
        2. Assurance of graduate learning outcomes through external review
        3. Challenges of assessing Graduate Learning Outcomes (GLOs) in work-based contexts
        4. Standardised testing of Graduate Learning Outcomes in Higher Education
        5. Approaches to the assurance of assessment quality
        6. Assessment policy issues in the effective assessment and assurance of GLOs
      ● Endnote library
      ● Response to government discussion paper on the Assessment of Generic Skills
      ● Co-authorship of "Mapping learning and teaching standards in Australian Higher Education: An issues and options paper"
      ● Interview findings
    • Keynote
      Trudy Banta - pioneer in outcomes assessment
      ● Professor in Higher Education
      ● Senior Advisor to the Chancellor for Academic Planning and Evaluation at Indiana University - Purdue University Indianapolis (IUPUI)
      ● founding editor of "Assessment Update"
      ● numerous publications on outcomes assessment
      http://www.planning.iupui.edu/103.html
    • Questions
    • AAGLO Interview Findings
      Audio link to discussion of project findings
    • AAGLO Interviews
      ● Ethical approval
      ● Telephone interviews
      ● Participants selected through the LTAS project and in consultation with LTAS scholars
      ● 84 invitations to academics across 7 disciplines (Accounting/Business, Chemistry, Drama and Performance, Engineering, History, Law, Veterinary Science) representing LTAS demonstration clusters and a range of university types and locations throughout Australia
      ● 48 interviews conducted, of approximately one hour each (2 partial)
      ● broad coverage of assessment and assurance practice and issues
      ● NVivo software for analysis and storage of data.
    • The disciplines we selected were ...
    • We interviewed ...
      ● 30 male and 18 female academics
      ● academics from 26 institutions
      ● 15 Deans / Associate Deans
      ● 12 with program-level responsibilities
      ● 36 with single course responsibilities
      ● 41 who taught in one or more courses
      ● 17 involved in disciplinary initiatives around assessment and standards, such as the LTAS project
      ● 10 involved in other national projects
      ● 3 LTAS Discipline Scholars
      ● 4 Quality Verification System (QVS) and 2 other external reviewers
      ● 4 past or current members of disciplinary accreditation panels
      ● several academics who had published in this area
    • Course levels
      ● L1 - 13
      ● L2 - 6
      ● L3 - 14
      ● L4 and above - 7
      ● Masters - 5
    • Key assessment tasks by discipline

      Task                               Business  Chemistry  Drama  Engineering  History  Law  Vet Science  TOTAL
      Critical review or essay               2         0        1        1           5      4        1         14
      Examinations                           0         0        0        0           0      1        4          5
      Oral presentation                      5         1        1        3           0      1        0         11
      Performance                            0         0        5        0           0      0        0          5
      Reflective piece                       1         0        1        2           0      3        0          7
      Report                                 6         6        1        6           0      1        1         21
      Tutorial and rehearsal activities      1         2        3        1           0      3        1         11
      Work placement                         1         0        0        0           0      2        0          3
      Working demonstration                  0         0        0        3           0      0        0          3
      Other                                  0         0        1        2           1      2        3         11
      Multicomponent tasks                   5         0        1        5           1      4        1         17
    • GLOs assessed using nominated tasks
    • Other task features
      Task relationship patterns within a course:
      ● Cumulative - 9 (a series of related tasks combined as a single product)
      ● Linked - 15 (successful completion of a task indicated likelihood of success in following tasks)
      ● Repetitive - 3 (same task repeated several times to develop expertise)
      ● Independent - 16 (different tasks assessed different components of a course)
      Active student role
    • Effective task characteristics
      ● Multiple, related stages
      ● Aligned with course learning objectives - incorporation of TLOs such as self-organisation, management and lifelong learning; reflecting on social, cultural and ethical issues; applying local and international perspectives; planning ongoing personal and professional development
      ● Blurred distinction between learning and assessment activities
      ● Activities and text types characteristic of the profession
      ● Authentic contexts, roles and audiences
        ● 12 real-life
        ● 25 lifelike (definitional range)
      ● Careful group task design, management and grading
      ● Active role that developed student capacity for self-assessment and self-directed learning
    • Task quality assurance practice
      Pre-implementation:
      ● Assessment policy
      ● Other related policy (e.g. Quality Assurance)
      ● Mapping of program curriculum inputs (25) or program assessment (5)
      ● Formal approval processes for new and revised assessment by variously titled committees:
        ● Course level (3)
        ● Program level (14)
        ● Faculty or school level (26)
        ● Institutional level (8)
        ● Multiple levels (15)
      ● Some approval for examinations only
      ● Informal only (5)
      Post-implementation:
      ● Formal evaluation processes (24) incorporating:
        ● review of student satisfaction surveys
        ● monitoring by boards of examiners or other committees
        ● audits and reviews
        ● documentation and reporting of responsive action by course and program coordinators and sometimes individual teaching staff
      ● Student representation on faculty TL Committees (6)
      ● Response to student complaints (1)
      ● Informal only (6)
    • Assuring task quality
      ● Approval from a whole-of-program perspective
      ● Approval for significant change as well as for new assessment tasks
      ● Effort spent prior to implementation to save effort after implementation
      ● Where multiple approvals are required, at least one level provides feedback beyond policy compliance
      ● Consequential review and evaluation procedures - action required and reported
      ● Institutional data collection and reporting support the evaluation process
      ● Inclusive - all have some level of responsibility for assessment quality
    • The basis of judgements
      ● Course LOs based on institutional graduate attributes (28), personal experience (17) and accreditation requirements (12)
      ● Common practice to provide criteria with marks, criteria-and-standards rubrics or marking guidelines
      ● Links between the wording of course LOs and assessment criteria often unclear
    • Assuring standards
      Pre-judgement (calibration) - examples:
      ● Workshop for staff to induct them into the standard expected for the award of different grades
      ● Project work is required at each level of the program, with about 70 academics involved in the assessment process. As part of their induction they are provided with a training session during which everyone marks particular group reports from previous years and displays their mark on yellow paper on the reports around the room, to enable them to compare their standards with those of others
      ● Preliminary marking of selected papers, with discussion of the application of criteria and standards prior to marking of the remainder of papers
      ● Much marking is undertaken by sessional staff. They are gathered together and the criteria are explained. The unit team pick out a small number of assignments randomly to mark and discuss.
      Post-judgement (consensus moderation):
      ● No moderation rare, and usually only where there is a single marker
      ● Moderation could be informal. The teams marking the assignment often sit in the same room to mark; they don't have to, but normally do so, as this is another opportunity for informal moderation.
      ● Consensus moderation most common approach (85 comments), e.g.
        ● discussion to reach agreement
        ● double marking
        ● random checks by coordinator
      ● Some instances (5) of a normal distribution requirement, with rescaling of 'outliers' or justification required
      Sadler, D. R. (2012). Assuring comparability of achievement standards in higher education: From consensus moderation to calibration. Manuscript submitted for publication.
    • Assuring judgement quality
      ● Shared standards at program and course level
      ● Effort spent to establish standards prior to judgements to save effort after judgements have been made
      ● Criteria and standards the basis for both assessment judgements and moderation
      ● Inclusive - all have some level of responsibility for assessment judgements, including casual staff
      ● Resourcing to support effective calibration and moderation processes - rescaling cheaper but less effective as professional development
    • Recording student GLO progress through a program
      ● Few examples of progressive recording of student GLO development
      ● Most common was aggregation of course grades in summary numerical forms, such as those required for progressive GPA calculation
      ● Some year-level (horizontal) approaches
      ● Mapping of inputs on the assumption that coverage of GLOs, in combination with aligned assessment, is a logical proxy measure of progress; challenged in institutions with standardised grade cut-offs such as 50% "Pass" grades
      ● 3 reports of informal approaches with small student cohorts (e.g. team meetings)
      ● Reservations about the effectiveness of ePortfolios, as practice is inconsistent
      ● Most reported monitoring student progress as a current priority - a wait-and-see attitude to possible TEQSA requirements
    • Quality improvement
      Examples:
      ● Nomination of task role and audience for a report after participation in the 'Achievement Matters' project
      ● Lecturer feedback more challenging after discussion and observation of feedback provided by colleagues
      ● Tutor provision of annotated samples of work to students to facilitate understanding of criteria and standards
      All examples attributed to quality assurance processes that encouraged and facilitated dialogue with colleagues