The Assessment Journey

An overview of current practice and activities

  • In 2012 a baseline review undertaken by 8 institutions showed:
    Strategy and policy: although there may be an overall assessment strategy, responsibility for implementing it is devolved to departments, leading to variation in assessment and feedback practices and making it difficult to achieve parity of experience for learners.
    Strategy documents tend to be quite procedural in focus and don’t reflect the value that assessment can bring to learning.
    When it comes to academic practice the issues are varied and complex, but include the emphasis on summative assessment and the persistence of traditional forms such as essays and exams.
    Timeliness, along with quality and consistency of feedback, was an issue across the board. Even where clear deadlines are set, feedback doesn’t always arrive in time to feed into the next assignment.
    There is a perception that learners don’t engage with the feedback they receive. Tutors may feel they have given a lot of feedback and support but it hasn’t been acted upon. Learners are seen as passive, waiting for feedback to be delivered to them, but the reality is less clear cut as the value of acting on feedback is not always well communicated.
    And finally, the assessment and feedback process, particularly the emphasis on high-stakes assessment and the value that is placed on marks and grades, is very different from the formative ways professionals develop during their working life, where much value is gained from feedback from, for example, peers.

    FE survey and sessions with digital leaders in 2015
    The different activities brought a consistent set of messages.
    Again, even where there were assessment strategies in place, responsibility was unclear and devolved.

    In the survey, respondents were asked to identify up to 5 significant barriers.
    Not surprisingly, the real or perceived lack of funding to implement wide-scale use was identified by just over half of respondents.
    Next was staff resistance and cultural concerns: 42% of respondents said they would prefer to continue with existing methodologies, which relates to a lack of confidence and skills with digital technology.
    There was also a lack of leadership, in terms of consistency of message from Government, Ofqual, Ofsted and awarding organisations.
    38% reported that the technology was difficult to implement, along with issues around scaling up.
    Infrastructure issues included lack of IT support (73%), availability of PCs and wifi, and capacity issues, e.g. room availability for online testing.



  • Since the survey was completed we have been engaged in conversations with a number of key stakeholders, including BIS, awarding organisations and regulators, to ensure we have a joined-up approach to tackling the challenges discussed today.

    We have gained agreement from representative bodies, including the Federation of Awarding Bodies, the eAssessment Advisory Group and the eAssessment Association, to move forward with a number of next steps in the first instance – highlighted here.
  • We have already made a start. Jisc, in collaboration with stakeholders including FE and skills providers, has produced an online guide which aims to start the process of working towards a consistent and positive message, clarifying value and benefits.

    The guide aims to show the value that technology can add to assessment and feedback, as well as clarifying terminology and setting the context in the light of sector changes. It also includes a range of examples of effective practice across FE and skills, to inspire by showing what is possible.

    The guide introduces the assessment and feedback lifecycle (originally developed by MMU) as a common framework for starting conversations around assessment and feedback; it shows a high-level view of the processes involved in any assessment, from formative to summative, from a short quiz to end-point assessment.

    It offers a means of encouraging dialogue between different types of stakeholders, who may work on one aspect of assessment and thus have a view of only part of the lifecycle. So far the model has resonated with everyone we have spoken to in the course of the research. It is also useful because it offers a ready means of mapping business processes and potential supporting technologies against it.

    The guide introduces the model, but also provides a way in to a fuller guide with much more detail on how technology can support all aspects of this lifecycle; a minimal sketch of the mapping idea follows below.
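    To make that mapping concrete, here is a minimal sketch in Python. The stage names follow the eight stages of the MMU-derived lifecycle; the technology mappings are illustrative assumptions on our part, not recommendations from the guide.

```python
# Sketch: the assessment and feedback lifecycle as a simple structure,
# with a hypothetical mapping of stages to candidate technologies.
LIFECYCLE = [
    "Specifying",
    "Setting",
    "Supporting",
    "Submitting",
    "Marking and production of feedback",
    "Recording grades",
    "Returning marks and feedback",
    "Reflecting",
]

# Illustrative only: a starting point for conversations, not a product list.
supporting_tech = {
    "Submitting": ["VLE assignment tool", "e-portfolio"],
    "Marking and production of feedback": ["online marking tool", "audio feedback"],
    "Returning marks and feedback": ["VLE gradebook"],
}

for stage in LIFECYCLE:
    print(f"{stage}: {', '.join(supporting_tech.get(stage, ['(to be mapped)']))}")
```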



  • We wanted to move away from thinking about e-assessment as onscreen tests, towards thinking more holistically about the many ways that technology can support all aspects of this lifecycle.

    The case studies highlight the many benefits that technology can bring to all aspects of the assessment lifecycle: from efficiencies in managing the process and more timely feedback; to enhancements, including the ability to provide richer feedback in the same amount of time you would spend on written feedback; and a process where learners are more in control of their learning.
  • An example of one of our case studies is Walsall College. Jayne Holt, Assistant Principal, will be sharing their story later today, and also in a podcast in the guide.

    The college has been rated as outstanding by Ofsted; the way in which its joined-up approach to assessment and feedback has empowered learners and enhanced their learning was an important factor.

    The College has implemented an approach to support the whole of the assessment and feedback life-cycle.

    Benefits:
    The previous arrangements led to a situation where hard-working tutors were creating increasingly passive learners. Now there is learner control and a level of transparency that allows learners to question what is happening, with feedback in one place.

    Although efficiency savings were not the key driver for EMA implementation, the College has saved tens of thousands of hours of staff time that previously went into low-value administrative activity, i.e. handwriting and then retyping forms that are now generated automatically. This has also improved the quality of data by removing the need to transcribe handwriting. The system also generates all of the evidence needed for HEFCE audit of its HE-level provision.
  • And in a completely different example, focusing on assessment practice rather than process, Swindon College have provided learners with their own personal learning space in the form of the Mahara e-portfolio.

    The college wanted to provide learners with greater choice and flexibility in how evidence is presented for assessment, to put learners in control of their own learning, as well as enabling them to gather rich, relevant evidence of their skills and achievements for future employers.

    Learners are also developing digital literacy skills, as well as decision-making skills as they choose what to present and how. They are gaining feedback from a range of audiences to help refine their work.

    Once work is ready for assessment, the URLs of the pages chosen by learners are copied into Moodle for grading.

    The college has found that using personal tools like this can transform learners’ sense of ownership of their work and, consequently, its standard.
  • At the start of the course, BTEC media learners are given accounts for social media tools so that they begin immediately to set targets for themselves, discuss, share and critically evaluate their own and others’ work using platforms such as Twitter, Tumblr or WordPress blogging tools or a closed group on Facebook.

    Teachers stress the importance of building and maintaining a positive digital reputation; what you do or say online is visible to everyone. But contrary to expectations, the public nature of the tools can instil mature attitudes.

  • Based on the NUS/Jisc tool on the digital student experience.

    It brings together ‘good practice’ statements around assessment design, delivery and management, and enables you to see what each would look like, in practice, within your own context.

    It also provides some self-assessment tools at the end for you to consider where you are in relation to each of those; a sketch of how such a matrix might be structured follows below.
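    Purely as an illustration, a benchmarking matrix of this kind could be represented as good-practice statements with indicators at four maturity stages. The stage names come from the draft tool; the statement and indicators below are invented placeholders.

```python
# Sketch: a good-practice statement mapped to indicators at four
# maturity stages, with a trivial self-assessment lookup.
STAGES = ["First steps", "Developing", "Developed", "Outstanding"]

# Invented placeholder content, not the actual tool.
matrix = {
    "Feedback is timely": [
        "Feedback deadlines exist in some areas",
        "Organisation-wide feedback deadlines are published",
        "Feedback routinely arrives in time to inform the next assignment",
        "Learners demonstrably act on feedback before the next assignment",
    ],
}

def self_assess(responses):
    """Print the chosen maturity stage (0-3) for each principle."""
    for principle, i in responses.items():
        print(f"{principle}: {STAGES[i]} - {matrix[principle][i]}")

self_assess({"Feedback is timely": 1})
```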
  • Not focused specifically at FE colleges, but worth mentioning here, is some work we did to help universities review their business processes around managing submission, marking and feedback in particular. It was found that even within one institution different departments were handling this in different ways, and that suppliers were finding it hard to navigate the range of approaches.

    Three ‘ideal state’ process maps reduce the processes to their most efficient form, from the student submitting a piece of work to their marks being returned: one map showing the top level for submission, marking and feedback, and detailed maps for submission and for marking and feedback.

    The maps outline the flow of tasks, and whose responsibility each could or should be, and highlight system requirements at each stage.

    They prompt questions such as:
    Are you doing additional tasks? If so, why?
    Are the tasks being done by the right people, e.g. do you have academic staff undertaking administrative duties that they don’t need to do?
    Do you have systems that could carry out some of the tasks you are doing manually?
    Do you have multiple ways of performing the same task? If so, why?

    For each of the process maps, the system requirements relating to each stage are highlighted.

    These requirements can either be looked at in relation to the processes, or as a full list of requirements to support the full EMA process; a sketch of this kind of mapping follows below.
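    As an illustration only, a process map of this kind might be modelled as ordered tasks, each with an owner and the system requirements it implies. The task names, roles and requirements below are invented examples, not the Jisc maps themselves.

```python
# Sketch: a submission process as ordered tasks, each carrying an owner
# role and the system capabilities that step needs.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    owner: str  # who could/should be responsible
    requirements: list = field(default_factory=list)  # system capabilities needed

submission_map = [
    Task("Publish submission deadline", "administrator",
         ["deadline visible to staff and students"]),
    Task("Student submits work", "student",
         ["online submission", "automatic receipt issued"]),
    Task("Check for late submission", "system",
         ["automatic late flagging"]),
]

# Requirements can be read per step, or flattened into one full list.
full_requirements = sorted({r for t in submission_map for r in t.requirements})
print(full_requirements)
```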

    This list was shared with suppliers earlier this year, and they have responded to show how they meet those requirements.
  • Suppliers responded to a template of requirements, all responses to which are now collated in one spreadsheet.
    The listing does not indicate any preference, and the responses are not validated, as they are self-reported.
    The list of requirements relates solely to EMA and may not represent the full functionality of the systems included here, e.g. student record systems cover many functions other than assessment.
    The listing includes products intended to cover most of the EMA lifecycle as well as some more niche products. It is intended as a means of identifying which combination of products could meet your needs, not as a like-for-like comparison of similar systems; a sketch of that kind of coverage check follows below.
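    One way such a collated spreadsheet could be used, sketched here with invented product names and requirement labels:

```python
# Sketch: find which combinations of products together cover a set of
# requirements. Products and requirements are placeholders.
from itertools import combinations

coverage = {
    "Product A": {"online submission", "originality checking"},
    "Product B": {"online marking", "audio feedback"},
    "Product C": {"grade recording", "returning feedback"},
}

needs = {"online submission", "online marking", "grade recording"}

for size in range(1, len(coverage) + 1):
    for combo in combinations(coverage, size):
        covered = set().union(*(coverage[p] for p in combo))
        if needs <= covered:
            print("Possible combination:", ", ".join(combo))
```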
  • We’ll mention more on this later this afternoon, but just to highlight here that we are taking forward a new project focusing on surfacing the value that technology can add in supporting the end-to-end management, delivery and assessment of the new apprenticeships.

    The approach draws on some of this previous research in the assessment space: we go back to exploring the key processes involved in delivery, and use that as a way of surfacing good practice and where technology can add value. We’d very much like to identify who would be interested in working with us as part of a working group; more to come later.


  • Gill
  • Gill
  • Sue’s slides
  • Apprenticeships are a growth area in FE and skills, and through the area review process providers have been challenged to increase their delivery of apprenticeships.

    With a government target of 3m starts by 2020, and a history of just over 2.2m apprenticeship starts over the five academic years from 2009/10 to 2013/14, this represents a huge increase.
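    In rough figures (assuming, as our reading of the target rather than a stated fact, that the 3m starts are spread over the five years to 2020):

\[
\frac{2.2\,\text{m starts}}{5\ \text{years}} \approx 440{,}000\ \text{per year (historical)},
\qquad
\frac{3\,\text{m starts}}{5\ \text{years}} = 600{,}000\ \text{per year (target)},
\]

    an increase of roughly a third on recent delivery rates.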

    Only through the effective use of technology can this target be delivered.

    So we are undertaking some exploratory work to inform the development and delivery of the new apprenticeship qualifications, focused on supporting decision making around the most effective use of technology.

    The focus is on articulating an ‘ideal state’ where technology is used to best effect, maximising the benefits it can offer both in terms of cost-efficiencies and learning enhancements. The audience for this work is those involved in the development and delivery of the new apprenticeship standards; this includes the Institute for Apprenticeships, FE colleges, skills providers, employers and awarding organisations.

  • These don’t have to be things Jisc can take forward; other agencies could take them forward too.
  • The Assessment Journey

    1. 20/10/2016 The Assessment Journey: an overview of current practice and activities
    2. Overview » The story so far » Assessment challenges » What we are doing to address those challenges » Links to tools and resources #jiscassess 20/10/2016
    3. The story so far… » 2011-2014: worked with over 30 institutions across a three-year programme on assessment and feedback » 2012: baseline reviews of institutional practice » 2013: case studies, videos and briefings » 2014: Electronic management of assessment (EMA) project with HELF and UCISA » Summer 2014: initial landscape study into EMA published » 2015: survey of FE and skills providers » 2016: FE guidance, case studies, benchmarking, all outputs from the EMA project. Background and context 20/10/2016 CC BY-NC-SA 2.0 ©comedynose via Flickr
    4. Assessment and feedback challenges – assessment and feedback institutional reviews (2012): › Highly devolved responsibility and inconsistent practices › Lack of developmental focus › Traditional practices dominate › Timeliness, quality and consistency of feedback › Learner in passive role › Lack of relevance to world of work. FE and skills survey and conversations with digital leaders (2015): » Over half had an e-assessment strategy, but lines of responsibility unclear and devolved » Lack of funding » Cultural concerns » Lack of leadership » Infrastructure and logistics » Technology 20/10/2016
    5. What has happened since – joint statement » Awarding organisations through the Federation of Awarding Bodies and the eAssessment Advisory Group » The eAssessment Association » Regulators, CCEA, Qualifications Wales and SQA » With supporting statement from Ofqual. Agreeing to: » Provide a consistent and positive message about the use of technology » Demonstrate the value and role of technology-enhanced assessment, and promote the end-to-end benefits » Ensure any potential barriers, perceived or real, to uptake are identified and removed » Explore where admin processes could be harmonised to ensure a consistent experience for providing organisations 20/07/16 Technology-enhanced assessment in FE and skills – how is the sector doing? 5
    6. New Jisc guide » Why technology? » Defining terms » Issues for FE and skills » A changing sector » Moving forward » Benefits › Organisations › Teaching and training › Learners » Case studies Enhancing assessment and feedback with technology: a guide for FE and skills 20/07/16 Available from bit.ly/Jisc-assessment-guide-FEandSkills Diagram adapted from an original by Manchester Metropolitan University Technology-enhanced assessment in FE and skills – how is the sector doing? 6
    7. From e-assessment to technology-enhanced… » More efficient management of the assessment process » Enabling more engaging assessments » Giving learners more control over their learning, developing critical self-assessment skills » Providing more timely, more consistent and richer feedback, from multiple audiences » Effective tracking of learners’ progress » Increasing learners’ employability and facilitating greater engagement with employers » Showcasing richer evidence of achievement » Flexibility over when and where assessments take place 20/10/2016
    8. Case studies: Walsall College 20/07/16 Technology-enhanced assessment in FE and skills – how is the sector doing? End-to-end electronic management of assessment (EMA) » Benefits: › Increased learner control › Rated ‘outstanding’ by Ofsted and praised for the change in the learner voice › Enhanced quality of data › Enhanced feedback › Tens of thousands of hours of staff time saved » Drivers: › To address student passivity in relation to the assessment and feedback process » Solution: › Using an integrated suite of technologies to support the whole of its assessment and feedback lifecycle 8
    9. Case studies: Swindon College » Drivers: › Need to provide learners with greater choice and flexibility over how evidence is presented for assessment › Need for a personal online space for students to reflect on and present achievements to employers » Solution: › Mahara e-portfolio Empowering learners 20/07/16 Technology-enhanced assessment in FE and skills – how is the sector doing? “Mahara is changing the way people view assessment. Learners own their e-portfolios so make personal choices over how they present themselves and their work. This in itself increases their responsibility, broadens their horizons and encourages thoughtful innovation.” Learning Technologies Manager, Swindon College 9
    10. Case studies: Basingstoke College of Technology » Drivers: › Employability › Achievement and retention » Solution: › Social media for the sharing, discussing, evaluating and showcasing of student work. Closing the feedback loop “Up to the point of submitting their e-portfolios, our learners are engaged in an endless loop of dialogue which gives them a far better opportunity to evidence their understanding.” Scott Hayden, specialist practitioner of social media and educational technology 20/10/2016
    11. But what does ‘good’ look like? » Draft benchmarking tool » Introduces a set of good practice principles, and identifies what these could look like at different stages of maturity › First steps › Developing › Developed › Outstanding 20/10/2016
    12. Clarifying business processes » Generic process descriptions facilitated clear definition of system requirements » Requirements validated with UCISA » Suppliers responded to requirements using a standard template 20/10/2016
    13. Supplier comparisons 20/10/2016
    14. Next steps: digital apprenticeships » New ‘co-design theme’ » New project to underpin the theme › Looking to surface where technology can add value to the end-to-end management, delivery and assessment of apprenticeships 20/07/16 Technology-enhanced assessment in FE and skills – how is the sector doing? 14
    15. Find out more » New guide for FE and skills, with accompanying case studies: bit.ly/Jisc-assessment-guide-FEandSkills › Report from the FE and skills e-assessment survey: ji.sc/eassessment-survey-final-report › Joint statement by key stakeholders: ji.sc/eassessment-survey-report-statement › Accompanying blog post: ji.sc/tech-enhanced-assessment-fe-skills 20/07/16 Technology-enhanced assessment in FE and skills – how is the sector doing? 15
    16. Find out more » Transforming assessment and feedback with technology: http://ji.sc/transforming-assessment-feedback-guide » EMA processes and systems guide: http://ji.sc/ema-processes-systems-guide » Supplier responses to system requirements: http://ji.sc/supplier-responses-ema » The evolution of FELTAG: https://www.jisc.ac.uk/reports/the-evolution-of-feltag Other guides of interest 20/07/16 Technology-enhanced assessment in FE and skills – how is the sector doing? 16
    17. Find out more 20/07/16 » Join the conversation on the blog: ema.jiscinvolve.org/ » and on twitter #jiscassess » Join the mailing list: jiscmail.ac.uk/tech-enhanced-assessment Technology-enhanced assessment in FE and skills – how is the sector doing? 17
    18. Any questions? 20/10/2016
    19. jisc.ac.uk Except where otherwise noted, this work is licensed under CC-BY-NC-ND. Find out more… Contact Lisa Gray, Senior Co-Design Manager, Student Experience, lisa.gray@jisc.ac.uk 20/10/2016
    20. 20/10/2016
    21. FE benchmarking tool » Hybrid origins: » Pedagogy from assessment & feedback work 2011-2014 » Covers effective management practice and use of technology » Self-assessment activity proven useful in HE » Matrix format from Jisc/NUS tool on digital student experience 20/10/2016
    22. FE benchmarking activity » Which headings do you prefer? » Are the indicators in the right columns? » Are there other indicators/examples we should add? » Other suggestions to make the tool more useful to you. 30 mins to review, 20 mins feedback & discussion 20/10/2016
    23. 20/10/2016
    24. Identifying your challenges » Consider the main challenges facing you and your organisation / staff / students with any aspect of the assessment process » Write one challenge per post-it » Be as specific as you can: › What is the problem › Who does it affect › What is the impact of that problem » Consider what challenges we can impact on: › i.e. not funding/time/Brexit! 20/10/2016
    25. 20/10/2016
    26. Digital Apprentice: New co-design theme 20/10/2016
    27. 1 Discuss emerging challenges / identify ideas (31st Oct – 24th Nov): release 6 challenge areas and invite Jisc members and other experts to discuss. Audience: managers, consumers, some leaders, other experts. 2 Prioritise ideas (4th Jan – 30th Jan): present ideas for activities Jisc could do and ask members which they support. Audience: managers, consumers, some leaders. 3 Announce successful ideas (6th Feb). Audience: everyone who followed the challenge. 4 Report progress (Apr/May). Audience: everyone who followed the challenge.
    28. The intelligent campus: Should we gather more data on students, staff and buildings that would allow us to deliver better experiences? Next generation learning environments: We think it is time for a new type of learning environment, but what would this look like? The digital apprentice: What would a truly digital apprenticeship look like? Data driven learning gains: Can we make better use of data to improve learning, teaching and student outcomes? Digital skills for researchers: How do we equip researchers and related staff with the skills they need for the future of research? Next generation research environments: We think it is time for a next generation research environment, but what would it look like?
    29. (A table view of the six co-design themes and their questions, repeating the content of the previous slide.)
    30. We think it’s time we had fully digital apprenticeships – to meet the needs of employers and apprentices in the 21st century. • What are the issues and problems you have identified in embedding technology in delivery and assessment of apprenticeships? • What are the opportunities to exploit technology to deliver high quality apprenticeships more effectively? • How can a data-driven approach lead to improved decision making by apprentices, employers and providers? • Can we provide a total digital experience for all apprentices to enhance and support their learning and assessment? • Can a digital approach help provide improved careers advice for apprentices and their parents? The digital apprentice
    31. New Digital Apprentice project » Map the high-level processes involved › End-to-end management, delivery, and assessment » Surface ‘good practice’ points relating to each process » Identify where technology can add value » Detail the system requirements » Supplier responses clarifying how systems meet the needs » Timeframe: › November 2016 to March 2017 » Working group informing the direction of travel › Please sign up today! 20/10/2016
    32. Apprenticeship lifecycle activity » On your tables, draw an apprenticeship lifecycle (or the apprentice’s learning journey if you prefer) › What are the high-level processes involved? › Highlight the key pain points from a learning provider perspective 20/10/2016
    33. 20/10/2016
    34. Next steps: what solutions are needed? » Group around a challenge cluster » Explore possible ‘solution’ ideas that might help to address that challenge › Be clear – what does the idea look like? Who is it for? Draw pictures! » Or join the ‘digital apprentice’ table. Activity 20/10/2016
