From Jisc's student experience experts group meeting in Birmingham on 21 April 2016.
https://www.jisc.ac.uk/events/student-experience-experts-group-meeting-20-apr-2016
3. Assessment and feedback challenges
Assessment and feedback (2012)
› Highly devolved responsibility and inconsistent practices
› Lack of developmental focus
› Traditional practices dominate
› Timeliness, quality and consistency of feedback
› Learner in passive role
› Lack of relevance to the world of work
EMA (2014)
» Nuanced responses but broad trends visible
» Localised initiatives but beginning to scale up
» System integration is a key problem area
» EMA exposes variability in business processes
20/04/2016
5. Developing solutions through ‘co-design’
Moving to solutions
» Workshops in December 2014 and January 2015 explored possible solutions to these prioritised challenges
» 1st workshop resulted in 30 solution ideas
» Ideas were synthesised down into 5 main areas
» 2nd workshop worked these up into 3 detailed project specifications
Blog post (10/09/2015): Moving from electronic management of assessment challenges to solutions
ema.jiscinvolve.org/
6. Developing solutions
Challenges and solutions:
1. Challenge: Addressing the variation of business processes and systems not meeting all needs (1, 7, 12, 14)
   Solution: Specifying the range of possible marking and moderation processes and surfacing how systems support them
2. Challenge: Assessment literacies, assessment design, risk aversion (4, 6, 8, 9, 11, 15)
   Solution: Online toolkit sharing effective practice
3. Challenge: EMA systems integration (3)
   Solution: IMS Global Learning Consortium Assignment Task Force
4. Challenge: Lack of a longitudinal/holistic view of student feedback; student engagement with feedback (10, 5)
   Solution: Researching where Jisc can add value in the ‘feedback hub’ space
5. Challenge: Addressing issues with reliable submission (2)
   Solution: (Suppliers addressing these issues)
7. Where are we now?
» Now available
› New online guide: Transforming assessment and feedback with technology
  www.jisc.ac.uk/guides/transforming-assessment-and-feedback
› New accompanying online guide on EMA processes and system specification, including supplier responses to requirements
  http://bit.ly/EMA-processes-guide
» Coming soon
› A self-assessment tool linking through to the guidance, by institution/department/faculty (due May 2016)
› Findings from the FE and skills e-assessment survey (due May 2016)
› Guidance and case studies for FE and skills (due June 2016)
8. Embedded in the resources
EMA case studies and examples
» Queen’s University, Belfast
» Institute of Education (now part of UCL)
» University of Hertfordshire
» Keele University
» Manchester Metropolitan University
» Bedford College
» Walsall College
» Sheffield University
» Podcasts
› Benefits of EMA
› Managing an EMA project
› Assessment and feedback lifecycle
› Processes and systems
› Value of Jisc resources
» Supplier responses to EMA requirements
› Summary view across all systems and individual supplier responses
» Videos
› Employability and assessment
› e-Portfolios
› Peer assessment
› Reconceptualising feedback
9. What you said…
» Using the resources for:
› Source for discussion for basic tutor training
› Academic unit away day sessions
› Institution-wide enhancement projects
› Articulating benefits to senior managers
» What’s missing:
› Online exams / tests
“The straightforward framework from MMU has enabled us to start programme level discussions.”
“The content is pitched at the right level to be useful to any academic/technologist facilitating or supporting assessment.”
10. Walkthrough
As a… | I want to… | So that…
Lecturer / teacher | Explore how I might develop more effective strategies for engaging my students with the feedback they receive | I can ensure they learn from the opportunities that feedback provides
Learning technologist / EMA lead | Identify which educational systems could best support my departmental processes around online submission, marking and feedback | The system(s) selected best meet our needs
Project manager | Explore how best to manage a move to online submission, marking and feedback in my department | We can plan the project to ensure maximum take-up by staff and students
12. Find out more
» Landscape report available from: bit.ly/jisc-ema
» Join the conversation on the blog: ema.jiscinvolve.org/
» and on Twitter: #jiscassess
» Join the mailing list: jiscmail.ac.uk/tech-enhanced-assessment
13. jisc.ac.uk
Except where otherwise noted, this work
is licensed under CC-BY-NC-ND
Find out more…
Contact
Lisa Gray
Senior Co-Design Manager,
Student Experience
lisa.gray@jisc.ac.uk
Editor's Notes
By 2014 the benefits and drivers were becoming clear, but issues were surfacing through a number of forums, including the HeLF SIG survey work and SIG meetings. So earlier this year the project started, in collaboration with the Heads of e-Learning Forum and UCISA. We aim to maximise the benefits technology can offer in the management of assessment by:
refining our understanding of the end-to-end assessment and feedback lifecycle across UK HE (and FE),
refining our understanding of how technology does and could support this lifecycle,
identifying and sharing approaches to adoption (including what is working well and what isn’t), and
providing guidance on effective practice in this space.
In 2012 a baseline review undertaken by 8 institutions showed:
In terms of strategy and policy, a key issue is that although there may be an overall assessment strategy, responsibility for implementing it is devolved to departments/faculties. This results in considerable variation in assessment and feedback practices, making it difficult to achieve parity of experience for learners.
Another key issue is that strategy documents tend to be quite procedural in focus and don’t reflect current thinking around effective assessment practice and the value that assessment can bring to learning.
When it comes to academic practice the issues are varied and complex but include the emphasis on summative assessment and the persistence of traditional forms such as essays/exams.
Timeliness, along with quality and consistency of feedback, was an issue across the board. Even where clear deadlines are set, feedback doesn’t always arrive in time to feed into the next assignment. Curriculum design (the modular approach) can also create barriers to an ongoing developmental approach to feedback at programme level.
There is a perception that learners don’t engage with the feedback they receive. Tutors may feel they have given a lot of feedback and support but that it hasn’t been acted upon. Learners are seen as passive, waiting for feedback to be delivered to them, but the reality is less clear cut: the value of acting on feedback is not always well communicated, and was notably absent from most induction processes. Learning design often puts the learner in a passive role.
And finally, the assessment and feedback process, particularly the emphasis on high-stakes assessment and the value placed on marks and grades, is very different from the formative ways professionals develop during their working lives, where much value is gained from feedback from, for example, peers.
EMA
There was a recognition that a number of localised initiatives were starting to scale up, and it was clear that institutions are increasingly moving beyond experimentation and looking to take a more strategic approach to the application of EMA, although that may not yet be institution-wide. There are some examples of policies in development in the report.
The challenges were evident across all three areas of technology, process and practice. It won’t come as a surprise that integration of systems was a key problem area, as was the variation in business processes unearthed when technology solutions were put into play, along with the challenges in assessment and feedback practice I’ve already touched on.
Through consultation we prioritised the challenges identified through the EMA landscape study - here you can see a top 20 listed around the assessment and feedback lifecycle.
At the start of the research there was a general feeling that stages 2-4 are better understood and less problematic than some of the other components of the life-cycle, not least because many institutions are managing all of the related information within a single VLE system. Stages 5-8 were felt to be where we begin to open Pandora's box ... This proved to be borne out by the findings of the research.
You can see the clustering around the later stages of the lifecycle, particularly around marking and production of feedback and recording grades. There are also a lot of issues that together group around the theme of student assessment literacies.
Marking and production of feedback appears to be the most problematic component of the life-cycle as it is the area where variety of pedagogic practice results in a situation where the fit between institutional processes and the functionality of commercially available systems is least well matched. We heard a very clear message from universities that existing systems do not adequately meet UK requirements in these areas. A basic issue is that marks and feedback are different things and need to be handled differently whereas technology platforms tend to conflate the two. It was also observed that systems seem too often to be predicated on an assumption that 1 student = 1 assignment = 1 mark. This model may usually be adequate for formative assessment but does not meet UK requirements for summative assessment processes. Systems would ideally offer a range of different workflows based on different roles e.g. first marker, second marker, moderator, external examiner etc.
So, by September last year we had reviewed the state of play, and through a number of opportunities had prioritised the challenges.
The next phase was, as part of a ‘co-design’ process, to collaboratively work up solution ideas to these challenges. To this end we ran, in collaboration with a service design group called Livework, two participatory workshops in December and January this year, attended by around 30 representatives from 30 different institutions. During these workshops participants from the sector came up with 30 solution ideas.
These fell into 5 clear groups – helpfully addressing the top few priorities.
These 5 areas of work were:
Addressing the variation of processes around marking and moderation in particular
The need to share examples of effective assessment design and to further develop staff and student assessment literacies
Issues with the integration of systems, particularly around not duplicating the entry of marks, grades and feedback
The lack of a holistic view of student feedback for both staff and students (often only available at module level)
The reliability of systems at key submission points, with a number of examples of systems falling down at critical times
Report launched in July last year.
We also have a range of additional resources on wider topics in assessment and feedback, including four themed guides, case studies and videos on employability and assessment; feedback and feedforward; and change strategies.
For more examples, see the Jisc Design Studio for all projects and themes.