Assessment in HE: Beginning to teach in HE



My slides for my Beginning to Teach in HE session on assessment, at DMU on 18/10/13


Notes for slides
  • Job prospects – loss of earnings for lower classification of degree (THE article 30 May)
  • Safeguarding standards – National Framework qualifications
  • We are doing this the wrong way around by showing you the outcome first – we should start by thinking about the content, skills, attributes to be tested
    The candidates share at least one characteristic (gender) but not others (ethnicity, age, current and previous experience). Do these affect the result?
    Clip = 3 mins; Oscar Peterson comes in after 1 min 30 sec
  • Assessment criteria deter assessment from being impressionistic, and stop those assessing from being swayed by external factors (reputation, previous performance, whether good or bad)
  • Don’t forget that each assessment task should link back to the module learning outcomes
  • Discussion
  • Discussion, Q&A

    1. Assessment in HE: Beginning to teach in HE. Professor Richard Hall @hallymk1
    2. What does *assessment* make you think about?
    3. In this session we will consider
       • The importance of assessment
       • The reasons for assessing students
       • Your experiences of assessment
       • The big issues and possible future trends in assessment
       • Some key principles of assessment
         o Assessment criteria
         o Marking students’ work
         o Providing feedback
    4. Assessment is important “…students can, with difficulty, escape from the effects of poor teaching, they cannot (by definition, if they want to pass or graduate etc.) escape the effects of poor assessment.” Boud (1995:35)
    5. Assessment is important
       • it affects people’s lives
       • it cements/challenges students’ views of themselves
       • it frames students’ views of HE
       • it is a major concern and burden for those teaching them
    6. it cements and challenges students’ views of themselves
    7. Why do we assess students?
    8. Why do we assess students?
       • To support their learning
       • To safeguard standards
       • To enable them to progress within their degree
       • To provide data for classification
       • To provide data for other “end users”, such as the public and employers
    9. DBIS on academic governance and assessment 2.2.48 There were also concerns about the perceived decoupling of teaching and assessment through the awarding of DAPs to non-teaching bodies. Many respondents felt that this would weaken the crucial link between teaching and research, to the detriment of the student experience. However, others welcomed the proposal to award DAPs to non-teaching bodies, which they felt would increase choice for colleges requiring validation and remove a longstanding anomaly from the system. DBIS. 2012. Government response to ‘Students at the heart of the system’.
    10. DBIS on academic governance and assessment Criterion B2 of the technical consultation. The applicant organisation will be required to provide evidence that:
        • the regulatory framework governing its higher education provision (covering, for example, student admissions, progress, assessment, appeals and complaints) is appropriate to its current status and is implemented fully and consistently; and
        • it has in prospect a regulatory framework appropriate for the granting of its own higher education awards.
        DBIS. 2012. Government response to ‘A new regulatory framework for the HE sector’.
    11. National Framework of Qualifications
    12. (image-only slide)
    13. (image-only slide)
    14. What does assessment do? Assessment
        • directs attention to what is important;
        • acts as an incentive for study;
        • has a powerful effect on what students do and how they do it;
        • communicates to students what they have and have not succeeded in doing.
        For some, assessment builds their confidence; for others it undermines it
    15. What does assessment do? Boud (1995) argues that assessment tends
        • to focus on demonstrating current knowledge, but
        • focuses little on the processes of learning and on how students will learn after the assessment point,
        • so in that way HE may be accused of failing to prepare students for the rest of their lives.
        Do you agree?
    16. Your experiences of assessment Think of a positive and a negative assessment experience.
        1. What were the factors that made it un/successful?
        2. On reflection, were there aspects that could have made it even better?
    17. Assessment in practice: constructive alignment The form of any assessment task should be about the last thing we think about. Biggs (1999) talked about “constructive alignment”
    18. Learning outcomes: What do we want students to learn? At various levels: programme, module.
        Teaching and learning: How do we enable/assist students to learn the things we have agreed upon?
        Assessment task: How do we know whether they have learnt these things?
        Feedback: How do we help them improve on their performance?
    19. Case study
        DMU Assessment pages:
        DMU Generic Mark Descriptors:
        DMU Guidance for Staff:
    20. Case study: “Keith and Oscar” In this example we are starting at the end of the process… There are two high-calibre candidates: both have the same task, both have the same equipment and support to help them, both are undertaking the task in the same “test” conditions. The candidates share one characteristic (gender) but not others (ethnicity, age, current and previous experience). So how would you assess them? Which is the better? v=ZvQIobg0BwU&feature=related
    21. Assessment case study Unpacking the assessment process:
        • Creating assessment criteria and assessment tool;
        • Marking the students;
        • Providing feedback.
    22. Creating an assessment tool In groups decide on the following:
        • What are the skills, attributes, content etc. that are to be tested in this assessed task? Are they weighted?
        • What are the broad categories you intend to assess? (This indicates what you place value on.)
        Is this task
        • Criterion referenced (given standards against which each student is individually judged), or
        • Norm referenced (students judged against the performance of their peers)?
    23. Marking: design an assessment tool You will use the DMU generic mark descriptors to assign a mark. Biggs argues that, for transparency, we mark against only the agreed criteria (content, skills and attributes). Decide whether the assessment is to be:
        • Formative (to help the learner to do it better next time), or
        • Summative (to provide the student and the institution with a grade/mark in order to indicate the level of attainment).
        This affects marking behaviour.
    24. Providing feedback Remembering the formative/summative nature of the assessed task, think about what feedback you would give to each learner:
        • To show how well they have done (a constructive balance of +ve and –ve)
        • To indicate how their performance compares to others (needs to link to the mark awarded)
        • To help them improve (a mixture of general and very specific advice, but with no guarantees?)
    25. Assessment case study v=ZvQIobg0BwU&feature=related
        Using the current DMU generic mark descriptors, what mark would you give?
        What feedback would you provide to the student?
    26. Big issues in assessment
        • Workload: reduce the burden on staff whilst ensuring students have the opportunity to meet all learning outcomes; how to maintain good levels of student support?
        • Feedback: need to maximise opportunities for students to receive and make good use of constructive feedback
        • Diversity: how to fairly assess a diversifying student population?
        • Plagiarism: how to “design out” opportunities for students to plagiarise in coursework assessments?
    27. 1. Variations in practice
        2. Audit/monitoring/scale versus variation
        3. Impact of the academic calendar on sequencing, feedback and feed-forward
        4. Space and time for development and innovation
        5. Work-based learning and assessment
        6. The role of for-profits
        “opportunities for students to engage with assessment design and the process of making academic judgements appears to be limited at present” JISC. 2012. A View of the Assessment and Feedback Landscape.
    28. Future trends (Brown et al 1997)
        Written exam → Coursework
        Tutor-led → Student-led
        Implicit criteria → Explicit criteria
        Competition → Collaboration
        Product → Process
        Objectives → Outcomes (“ability to…”)
        Course → Module & programme
        A-levels → APL
    29. Future trends Do changes in the student body mean changes in assessment?
        • Different A-level experience (more contact, modular, multiple opportunities)
        • More distance learning?
        • More employment-related “training”
        • More e-learning? DMU e-assessment project
    30. Future trends Greater pressures for quicker turnaround times, combined with greater demand for advice on how to improve (consumerism, via £9k fees):
        • Changes the work we set to be assessed?
        • Reduces experimentation (limit personal liability against litigation?)
        • Increases caution
        • Changes the substance of the feedback we provide and the way in which we provide it
    31. • For learning or knowing
        • Complexity and increasing uncertainty in the world demands resilience
        • Integrated and social, rather than subject-driven
        • Engaging with uncertainty through projects that involve diverse voices in civil action
        • Discourses of power: co-governance; co-production?
        • Authentic partnerships, mentoring and enquiry, in method, context, interpretation and action
        • How does our assessment experience inform resilience and our work at scale?
    32. Stuff to read
        Biggs, J. (1999) Teaching for Quality Learning at University. Buckingham: SRHE/OUP.
        Boud, D. (1995) Assessment and Learning: Contradictory or Complementary? In P. Knight (ed.) Assessment for Learning in Higher Education. London: Kogan Page.
        Brown, G. et al. (1997) Assessing Student Learning in Higher Education. London: Routledge.
        CELT Hub:
        CELT on anonymous marking:
        JISC assessment projects: