Assessment and Feedback programme update (April 2012)

Update on the Assessment and Feedback programme, presented at the JISC Learning and Teaching Experts meeting, April 2012

  • Assessment, as we all know, lies at the heart of the learning experience. Over the last couple of years JISC has recognised the importance of this area as a key driver of change in institutions, as well as the enormous potential of technology to transform assessment practices – proven by previous programmes of work, including the recently completed Curriculum Design and Curriculum Delivery programmes. We have recently launched a new guide, Effective Assessment in a Digital Age, which showcases some of the excellent work that has gone before, and have run a number of national workshops taking this work into institutions. This programme builds on all of that past work, aiming to take us forward in new ways. Strand A: Institutional change. These projects are redesigning assessment and feedback processes and practices, making best use of technology to deliver significant change at programme, school or institutional level (15 months to 2 years funded, £100-£200k per project). Strand B: Evidence and evaluation. You'll hear more about these projects today in the next session. They are evaluating assessment- and feedback-related innovations already under way in a faculty or institution, and will report on lessons for the sector (6 months to 2 years, £20k per project). Strand C: Technology transfer. These projects are packaging up technology innovations in assessment and feedback for re-use (with associated processes and practice) and supporting their transfer to two or more named external institutions (9 months to 2 years, £40k per project). Support project: a Support and Synthesis project, led by JISC infoNet in partnership with JISC CETIS, is co-ordinating a support and synthesis team to support projects across all strands through a range of mechanisms, including webinars, access to sector expertise and peer support through CAMEL networks.
  • There are 32 institutions involved in the programme, based all over the UK, from Dundee to Cornwall and Belfast to Hertfordshire. Strand A: Bath Spa University; Winchester University; Cornwall College; Hull College; University of Dundee (x2); University of Exeter (x2); University of Hertfordshire (x2); Institute of Education; Manchester Metropolitan; Queen's University Belfast. Strand B: University of Edinburgh (x2); University of Glamorgan; University of Huddersfield; The Open University (x2); University of Westminster; University of Greenwich, City University London, UWIC; University of Reading, University of Bedfordshire (x2); University of Manchester; University of Southampton (x2); King's College London; Kingston University; Harper Adams University College (x2); University of Strathclyde (x2); University of Glasgow (x2); University of Nottingham; De Montfort University; University of the West of Scotland; University of East Anglia; University of Oxford.
  • These aims are reflected in the diagram, which represents the goals of the 8 institutional change projects. Some will be addressing one of these, but all are related. At the heart is the aim of enhancing learning and teaching practice through assessment for learning – meeting students' needs for well-timed, meaningful, authentic assessment experiences, backed up by feedback that increases their understanding of the topic. Better understanding of the criteria, goals and standards before an assignment is also involved, hence a focus on feed forward in some project outlines. Alongside this sits the development of more integrated strategies, policies and processes, with technology acting as a catalyst for change across institutional strategies and processes, and the associated potential overhaul of design, review and approval processes. At the very least, e-learning strategies should be built into the institution's assessment strategy; one project has among its main aims to develop that institutional vision for assessment. Increased efficiency: three aspects are in focus – a) pedagogical gains from more efficient working practices, e.g. technology-enhanced feedback, or greater opportunities for regular and frequent testing even in large-group contexts; b) time saving, e.g. through data being reused effectively to inform institutional continuous quality improvement and, from a learner perspective, to provide a better overview of progress, leading to increased self-regulation – more productive use of staff and student time is perhaps the most frequently cited efficiency gain from the use of technology (FASTECH, InterACT, iTEAM, e-AFFECT); and c) institutional quality assurance may also gain. Improved student learning: in line with the principles for good assessment and feedback developed by Nicol and Macfarlane-Dick, Chickering and Gamson, and Gibbs and Simpson, the quality of student learning is linked directly to the way assessment, and particularly feedback, are designed. InterACT focuses on the value of engaging students in an active response to feedback: 'The system shifts responsibility to the student to engage with the feedback in a meaningful way and the outcome measures of time will determine the staff effort invested against measures of satisfaction, quality of reflection, progress through modules and attrition rates.' Dialogue with self, peers and tutors is seen as a key ingredient, and redesigns of feedback often make use of the interactive but asynchronous potential of technology-mediated communication. With greater alignment of assessment designs with employer and student needs (Collaborate, FAST, iTEAM), student satisfaction with assessment and feedback is also seen as likely to improve.
  • From Strand A projects we're looking for a range of outputs, including a summary of their starting points (discussed more later), end-of-project evaluation reports articulating the impact of their work, a range of assets showcasing the evidence of impact, and an overview of their achievements with guidance and support for others in doing the same. Strand B projects are focusing on evaluation and evidence of benefit. Strand C will deliver technical models, open source technologies and code, as well as supporting developer guidance and documentation, and a summary of the process of roll-out to other institutions.
  • This is just a flavour of the technologies being used to enhance practice and process across the programme. These include general learning technologies, including the VLE, as well as specific assessment and feedback technologies.
  • Projects have identified a range of challenges and potential benefits for technology enhancement. Two have a clear focus on the timeliness and quality of feedback, plus ensuring opportunities for reflection on, dialogue about, and engagement with feedback, looking at innovative 'longitudinal' and 'feed-forward' systems. Others are looking more at assessment management, and how technology can enhance and make more efficient the processes of submission, marking, etc. Others are exploring design issues such as the timing and sequencing of assignments, with clearer articulation of criteria, goals and standards. All change projects have been asked to identify, and place at the heart of their change projects, educational principles around good assessment and feedback practice, e.g. REAP.
  • Based on models that have worked well in previous programmes, the projects (and the programme as a whole) are supported by a central team responsible for getting the most out of the projects and their achievements – particularly by ensuring project teams are supported in developing robust evaluation plans for gathering evidence of impact, and by capturing the lessons learnt as the programme develops.
  • Why did we do this? At the programme level, it is a way of validating the entire rationale for the programme – we want to know that there is a job that needs doing, so, as far as we are concerned, it is valuable to identify that there are significant areas that need improvement. We want to know that we are addressing issues that are important to the sector as a whole rather than working in niche areas of limited wider application. We want to know that we aren't reinventing the wheel and that the projects are building on earlier work. And we want to develop a shared understanding of our aims and direction of travel. Note the importance of ongoing dialogue with groups such as the L&T Experts to continue to achieve these goals over the life of the project.
  • The first bullet point is repeated because it is important for strategic projects like these, with major transformational goals, to know where they sit in the wider landscape. The requirement to do this work is based on experience from many other JISC projects showing that the exercise is useful. Reports back from A&F projects show that the dialogue itself is valuable – in helping define the status quo and the issues, stakeholders are already beginning to be engaged and ready to take ownership of the solutions. And, maybe most importantly, it is a way of capturing the state of play at a point in time, providing a baseline measure against which change can be evidenced (although this is an ongoing process). As a way of promoting discussion around the issues, it is also a means of creating a state of change-readiness.
  • This is just a collection of the different approaches taken to this exercise, and the breadth of evidence captured. These reflect the different areas explored: strategy and policy as represented in formal documentation and processes; assessment and feedback processes; and practice in terms of course-level evaluations and the voices of staff and students. Projects were given guidance around capturing evidence relating to strategy and policy, process, infrastructure and stakeholders. This illustrates the richness of the evidence base. Individuals and small groups are identifiable in many cases, hence the anonymity of the synthesis report. The list can act as a source of ideas for anyone reviewing practice in their own institution and seeking to compare their findings with our summary. Some of these artefacts may also be useful to revisit as measures of change, e.g. shifts in the language used in formal documents may reflect cultural change.
  • These example wordles of the text of a couple of reports give a flavour of the most distinctive differences in the approaches taken by projects. Some of the baselines show a lot of emphasis on formal/summative assessment and the processes and procedures that surround it. In this example, words like assignment and submission predominate. Another project noted that the term 'summative assessment' is rarely used in institutional documentation because it tends to be assumed that all assessment is summative unless otherwise stated.
  • Another group of reports reflects institutions that are focusing more on the practice of assessment and feedback, with a more direct focus on enhancing how learners engage with their feedback. These reflect projects that the project teams believe to be a little more change-ready. Their baselines show more emphasis on feedback – in this example, learning and assessment appear in equal measure. We should not, however, over-emphasise the differences between the two. Some of the institutions that are concentrating on formal processes are very concerned with the learner experience and with achieving parity of learner experience across the institution. Some of the institutions that are basing their work on the premise that developmental feedback is more valuable to the learning experience than summative assessment are nonetheless basing their project ambitions on pockets of good practice (either internally or elsewhere) that are far from being mainstream as yet.
  • We believe the baseline picture we have drawn is a reasonable summary of the state of play across the wider landscape. This rich picture from one of the projects, at the IOE, sums up their view of the terrain. The map has at its heart the 'student experience' province; student experience is linked to policy by a road, as their concerns relate to each other. The 'teaching staff' province is less well defined, but strongly linked to student experience and less well linked to policy, as staff interpretations vary. The 'external examiners' province is unclear and on the fringes of the mainland. The project described the map of the territory as follows: 'The sector practice includes two provinces which are strongly defined and these are the technological innovation in assessment and feedback and the good practice guidelines which draw on wider research and theorising of assessment for learning. The overall sector practice is less well defined and is represented on the map as a large area of uncertainty and danger in the current climate of retrenchment and funding cuts with high mountain ranges and sea monsters.' If time permits, allow a few minutes' discussion on whether people recognise this picture and agree that good/innovative practice and mainstream practice are still separate territories.
  • Given the time we have today, I'm going to share just a few of the headline findings in three areas – strategy and policy, stakeholder engagement, and practice – and, if we have time, I'd like you to consider these findings on your tables to see if they reflect your experiences. 1. Despite the central part played by assessment and feedback in the learning and teaching process, there is little evidence of institutions being strongly directive in their strategy and policy steer in these areas. Institutions appear more likely to have assessment 'frameworks' and 'policies' than top-level strategies, and these are often mainly aimed at guiding the development of strategy and policy at a more devolved (school/faculty/college) level. There are some excellent examples across the sector where institutional strategy puts assessment at the heart of learning, but joined-up approaches (e.g. linking institutional strategy, TEL, A&F and learning spaces) are by no means commonplace. Formal documents tend to be quite procedural rather than emphasising the developmental aspects of A&F; in this sense they seem to lag behind the thinking that underpins most of the projects. 2. The programme is sending a strong message that sound educational principles need to underpin A&F practice. The principles that most projects identify as relevant to their work (e.g. REAP, NUS, Gibbs and Simpson (2004)) tend to differ from those described in formal documents, which are again concerned more with procedure and consistency. 3. The fact that ultimate responsibility for A&F practice is highly devolved leads to considerable variation in practice, as borne out by testimony from both staff (including external examiners) and students in the reviews. This is a major concern for senior managers, who feel that lack of parity in the student experience may be a source of complaint or litigation in future. There is a series of related concerns, discussed in the report, about the ability to support learners' longitudinal development.
  • Regarding stakeholder engagement, three themes emerged from the reports. 1. Learner engagement is a major feature of many of the projects in the programme but was noticeably absent from the descriptions of baseline practice. There was a similar gap around the role of student induction or training in preparing learners to get the most from assessment and feedback. Engagement with small groups of learners is often seen as adequate, rather than widespread engagement (repeated for each cohort) being embedded practice. 2. The projects made many interesting observations about A&F in relation to employability. The difference between the institutional emphasis on summative assessment and the more formative ways in which professionals develop throughout their careers (including through extensive use of peer review) was noted. One project went so far as to challenge the received wisdom that greater definition and clarity in relation to marks and grading is necessarily a good thing: 'What emerged was the notion of business clients as assessors, and the critical notion that in a business scenario you do not know exactly what you will be assessed on. Indeed, it could be said to be part of the assessment itself, in a business context, to work out exactly what is expected from you based on the client and the information that they may or may not be providing.' This raises the challenging question: are we being too specific in detailing exactly how students get marks from our assessments? Should part of the assessment be the task of working out which are the more crucial parts of the assessment itself? 3. In a webinar to discuss the baseline findings, some projects expressed surprise at how 'clunky' their admin processes appeared as a result of their reviews. It is clear that the diversity of admin processes within each institution (devolved responsibility meaning each faculty or school does things its own way) is a barrier to getting the most out of the IT systems that support A&F. It is also notable that, although admin staff are identified as stakeholders, they tend to be the people that 'we haven't got round to talking to yet'. Given the dual emphasis of the programme on effectiveness and efficiency, it is evident that admin processes need to be a significant focus for improvement.
  • 1. Although the reviews identified a large variety of assessment types in use, it appears that traditional forms such as essays and exams still predominate. Project approaches vary from trying to introduce greater awareness of different forms of assessment to trying to improve practice in relation to the most commonly used types. Both staff and students need to be convinced of the benefits of peer-to-peer learning. 2. Along with consistency, the timeliness of feedback appears to be a key issue across the board. Many schools and departments do set deadlines for marking and the return of feedback, but there remain issues with the overall assessment timetable and whether students receive feedback in time for it to be useful in preparing for the next assignment. Where this does occur, there is substantial evidence that students appreciate and act upon feedback. 3. The extent to which feedback supports the ongoing development of the individual learner, by feeding forward into their future learning, depends on the quality of the feedback given and whether it is developmental in nature rather than simply justifying a grade. In some cases curriculum design presents structural barriers to this type of development, arising from modularisation and from distributed teaching teams, which inhibit continuity in establishing a relationship with the student or a shared context for the assessment task. A number of the projects in the programme are looking at ways to better join up the overall learning experience. One project is looking to develop the concept of an 'assessment career', building on a range of previous research including work on ipsative assessment, whereby feedback acknowledges progress against the learner's previous performance regardless of achievement; another is aiming for process change to provide a longitudinal overview of the feedback provided and student reflection on previous assignments; and a further project is focusing on whole-programme transformation to support coherent approaches to student learning, through assessment design focused at the level of the degree programme rather than the module.
  • JISC Assessment pages. These provide: background to JISC's activities in technology-enhanced assessment; a link to the latest JISC publication on assessment, Effective Assessment in a Digital Age; information on previous projects and activities in the assessment space; and resources, including activities to run with stakeholder groups, developed as part of a series of national workshops – including an overview of the leading educational principles around good assessment and feedback practice.
  • A range of different types of tools, case studies, models, learning designs, lessons learned, etc., which support teams in designing, developing and delivering the curriculum in their institutions.
  • As part of the assessment and feedback programme, and thanks to the excellent work of Ros Smith and the synthesis team, we've now compiled a set of assessment and feedback pages taking you through the key themes, which will provide a framework for compiling and making sense of project outputs as they arise.
  • EBEAM – Evaluating the Benefits of Electronic Assessment Management: evaluation of Turnitin tools, including GradeMark, eRater and QuickMark, across the School of Music, Humanities and Media. EEVS – Evaluating Electronic Voting Systems for Enhancing the Student Experience: use for both formative and summative purposes. EFFECT – Evaluating Feedback for E-learning: Centralised Tutors: evaluating an online system for centralising tutor support and feedback. University of Glamorgan – evaluating online marking through GradeMark, and an in-house assessment diary system which enables the sharing of submission and return dates between staff and students. OCME – Online Coursework Management Evaluation: evaluating an online course management and feedback system based around Moodle and Turnitin. MACE – Making Assessment Count Evaluation: evaluating a process (eReflect) to encourage learners to reflect on their feedback, in a number of different institutions. SG4CL – Student-Generated Content for Learning: Enhancing Engagement, Feedback and Performance: evaluating the PeerWise open source tool, which enables the development of student-created questions.
  • Transcript

    • 1. Assessment and Feedback programme, 24th April 2012. www.jisc.ac.uk/assessmentandfeedback #jiscassess
    • 2. Overview: overview of programme, strands and deliverables. www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
    • 3. Programme overview. Strand A: 8 projects, 3 years, 2011-2014. Strand B: 8 projects, 6 months to 2 years, 2011-2013. Strand C: 4 projects, 9 months to 2 years, 2011-2013. Plus a Support and Synthesis Project.
    • 4. Locations
    • 5. Programme level outcomes Increased usage of appropriate technology-enhanced assessment and feedback, leading to: – Change in the nature of assessment – Efficiencies, and improvement of assessment quality – Enhancement of the student and staff experience Clearly articulated business cases Models of sustainable institutional support, and guidance on costs and benefits Evidence of impact – on staff and students, workload and satisfaction
    • 6. Strand A goals and objectives: improved student learning and progression; enhanced learning and teaching practice; integrated strategies, policies & processes; increased efficiency. Overarching goals from Strand A projects, synthesised from their bid documents.
    • 7. Deliverables. Strand A: baseline report; evaluation report; range of assets – evidence of impact; guidance and support materials. Strand B: description of user scenarios; descriptions of previous work in the area; evaluation report; range of assets – evidence of impact; short briefing paper summarising the innovation and benefits. Strand C: summary of technical model; open source widgets and code; developer guidelines; documentation for users; active community of users; short summary of the innovation.
    • 8. Technologies
    • 9. Themes and challenges
    • 10. Programme and support team. www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
    • 11. Programme Support Team: Programme Co-ordinator; Critical Friends; Evaluation Support; Synthesis; Support Team.
    • 12. What are we learning about technology-enhanced assessment and feedback practices? www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
    • 13. Why baseline? Programme level: view of landscape & direction of travel; validate aims & rationale; shared understanding; identify synergies with other work; deliver effective support.
    • 14. Why baseline? Project level: view of landscape & direction of travel; validate scope; confirm/identify challenges; identify stakeholders; manage & communicate scope; challenge myths; identify readiness for change; show evidence of improvement. An important stage of engagement/ownership.
    • 15. Sources of baseline evidence: structured and semi-structured interviews (some video); workshops and focus groups; process maps; rich pictures; institutional (and devolved) strategy & policy documents; questionnaires; institutional QA documentation; reports by QAA, OFSTED & external examiners; course evaluations; student surveys; quantitative analysis of key data sets; data from research projects.
    • 16. Differences in emphasis
    • 17. Differences in emphasis
    • 18. Are our projects typical of the landscape?
    • 19. Issues: strategy / policy / principles. Formal strategy/policy documents lag behind current thinking. Educational principles are rarely enshrined in strategy/policy. Devolved responsibility makes it difficult to achieve parity of learner experience.
    • 20. Issues: stakeholder engagement. Learners are not often actively engaged in developing practice. Assessment and feedback practice does not reflect the reality of working life. Administrative staff are often left out of the dialogue.
    • 21. Finding: assessment and feedback practice. Traditional forms such as essays/exams still predominate. Timeliness of feedback is an issue. Curriculum design issues inhibit longitudinal development.
    • 22. Key resources. www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
    • 23. http://www.jisc.ac.uk/assessment
    • 24. http://www.netvibes.com/jiscinfonet#%23jiscassess
    • 25. http://jiscdesignstudio.pbworks.com
    • 26. Assessment & Feedback hub pages: peer assessment & review; assessment management; effectiveness & efficiency in assessment; employability & assessment; authentic assessment; work-based learning & assessment; longitudinal & ipsative assessment; assessment for learning; feedback & feed forward – all around the central theme of Transforming Assessment & Feedback. http://tinyurl.com/jiscafds
    • 27. Activity: decide if you agree or disagree with each of the statements made on the previous slides (as being representative of mainstream practice in the sector). If you agree – state examples of what can be done about it. If you disagree – state examples of evidence to the contrary.
    • 28. © HEFCE 2012. The Higher Education Funding Council for England, on behalf of JISC, permits reuse of this presentation and its contents under the terms of the Creative Commons Attribution-Non-Commercial-No Derivative Works 2.0 UK: England & Wales Licence. http://creativecommons.org/licenses/by-nc-nd/2.0/uk
    • 29. Evidence and evaluation projects – Strand B: EBEAM – University of Huddersfield; EEVS – University of Hertfordshire; EFFECT – University of Dundee; OCME – University of Exeter; MACE – University of Westminster; SG4CL – University of Edinburgh; The evaluation of Assessment Diaries and GradeMark – University of Glamorgan.
    • 30. Timings 11.15 – 11.35: Participants move round all 3 rooms to look at the 7 posters and have short introductory discussions with projects – Identify 3 projects you’d like to know more about 11.35 – 11.50: Discussion with Project 1 11.50 – 12.05: Discussion with Project 2 12.05 – 12.20: Discussion with Project 3
    • 31. Rooms Proceed - Student-Generated Content for Learning: Enhancing Engagement, Feedback and Performance (SGC4L project), Judy Hardy, University of Edinburgh Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS project), Amanda Jefferies, University of Hertfordshire Propel 1 - Making Assessment Count Evaluation, (MACE Project), Gunter Saunders and Peter Chatterton, University of Westminster, Mark Kerrigan, University of Greenwich and Loretta Newman-Ford, Cardiff Metropolitan University Evaluating feedback for e-learning: centralized tutors (EFFECT project), Aileen McGuigan, University of Dundee Propel 2 - Evaluating the Benefits of Electronic Assessment Management, (EBEAM project), Cath Ellis, University of Huddersfield Online Coursework Management Evaluation (OCME project), Anka Djordjevic, University of Exeter The Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan - Karen Fitzgibbon and Sue Stocking, University of Glamorgan
