Full presentation for Hertfordshire
Presentation given at the 6th International Blended Learning Conference


    Presentation Transcript

    • Making Assessment Count Consortium (e-Reflect)
      Presenting today: Sian Lindsay and Kate Reader (City University London) and Gunter Saunders (University of Westminster)
      The team: Mark Clements (University of Westminster), Mark Gamble (University of Bedfordshire), Rae Karimjee (City University London), Mark Kerrigan and Simon Walker (University of Greenwich), Loretta Newman-Ford (University of Wales Institute, Cardiff), Maria Papaefthimiou (University of Reading)
    • Making Assessment Count Consortium
      • An informal group with a collective interest in benefiting from, and further developing and adapting, best practice in the use of technology to enhance the effectiveness and use of feedback;
      • Aims to realise the benefits of JISC-funded projects such as Making Assessment Count and other projects the partners have been or are involved in.
    • The Original Making Assessment Count Project: What did we do?
        • Sought to address issues around feedback by:
        • Developing a framework for action on feedback called the SOS model:
          • To encourage students and staff to work through a process designed to promote action on feedback
          • To link action on feedback to the personal tutorial system
        • Developing a ‘small piece of technology’ (software we call e-Reflect) to:
          • Help the student strategically reflect on their feedback
          • Connect the student’s feedback and their reflections to the personal tutor
    • Summary overview diagram of Making Assessment Count [diagram slide]
    • Encouraging students to reflect on their feedback
        • The e-Reflect questionnaire encourages students to reflect on key aspects of their performance
        • In response to selected answers, the tool automatically provides additional feedback (student-generated) and prompts to stimulate deeper reflection
        • Questionnaires can be authored and aligned for specific types of coursework
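    The branching behaviour described above can be sketched as a mapping from selected answers to extra feedback prompts. This is an illustrative sketch only, not e-Reflect's actual implementation; the questionnaire content and function names below are invented for the example.

    ```python
    # Illustrative sketch only: e-Reflect's real implementation is not shown
    # in the presentation. This models the behaviour described above:
    # selected answers trigger additional feedback and prompts designed to
    # stimulate deeper reflection.

    # Hypothetical questionnaire authored for one type of coursework (essay).
    ESSAY_QUESTIONNAIRE = {
        "Did you address every part of the question?": {
            "no": "Re-read the brief and list each sub-question before planning.",
            "unsure": "Highlight the command words (discuss, compare) in the brief.",
        },
        "Did you act on feedback from your previous essay?": {
            "no": "Pick one point from your last feedback and note how you would apply it here.",
        },
    }

    def extra_feedback(questionnaire, answers):
        """Return the additional prompts triggered by the student's answers."""
        prompts = []
        for question, answer in answers.items():
            prompt = questionnaire.get(question, {}).get(answer)
            if prompt:  # only selected answers mapped to a prompt fire
                prompts.append(prompt)
        return prompts

    answers = {
        "Did you address every part of the question?": "no",
        "Did you act on feedback from your previous essay?": "yes",
    }
    print(extra_feedback(ESSAY_QUESTIONNAIRE, answers))
    ```

    Authoring a questionnaire for a different type of coursework, as the slide notes, would amount to supplying a different mapping.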
    • Benefits to staff, students, and the institution
        • Staff
          • Personal tutors provided with more timely information on their tutees’ overall performance;
          • Clear framework for the provision of feedback.
        • Students
          • Additional layer of feedback facilitating strategic reflection;
          • Strategy for using and acting on feedback.
        • Institution
          • Raises the profile of feedback (what it’s for and how it should be exploited within a broader assessment strategy);
          • Supports shifting the emphasis of the personal tutorial towards academic performance.
    • Making Assessment Count Consortium Progress
      • UWIC – piloted with up to 90 Sports Medicine undergraduates
      • Reading – planned pilot with over 100 final year project students
      • Westminster – used now to support feedback on written exams
      • City – planned pilot incorporating use of Moodle with Politics undergraduates
    • Making Assessment Count Consortium UWIC
      • Students in a recent focus group were very positive
      • Students said the process had made them think about:
        • their own strengths and weaknesses
        • how they approached the assessment task
        • areas for future development
        • usefulness of tutor feedback
    • Making Assessment Count Consortium Westminster
      • Piloting use of the MAC process with exams
      • Two-stage process:
        • Immediately after the exam, students complete a questionnaire in which they predict their mark and reflect on why they think this
        • They are subsequently given an opportunity to see an annotated copy of the marked script
        • They then complete a second questionnaire and add to the content of their learning journal
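    The two-stage process above hinges on the gap between the predicted and actual mark. A minimal sketch of how that gap might seed a second-stage reflection prompt (the function, thresholds, and wording are assumptions for illustration, not the Westminster pilot's actual software):

    ```python
    # Illustrative sketch only (names, thresholds, and prompt wording are
    # assumptions, not the Westminster pilot's implementation).
    # Stage 1: the student predicts a mark straight after the exam.
    # Stage 2: after seeing the annotated script, the gap between prediction
    # and actual mark seeds a prompt for the learning journal.

    def stage_two_prompt(predicted, actual):
        """Turn the prediction gap into a reflection prompt."""
        if actual > predicted + 5:
            return ("You predicted {} but received {}. Which questions went "
                    "better than you expected, and why?".format(predicted, actual))
        if actual < predicted - 5:
            return ("You predicted {} but received {}. Looking at the annotated "
                    "script, where did you lose marks you expected to "
                    "gain?".format(predicted, actual))
        return ("Your prediction ({}) was close to your mark ({}). What does "
                "that tell you about your self-assessment?".format(predicted, actual))

    # A student who over-predicted is steered towards the annotated script.
    journal = []
    journal.append(stage_two_prompt(predicted=70, actual=58))
    print(journal[-1])
    ```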
    • Kate Reader, Senior Educational Technologist, Schools of Arts and Social Sciences, City University London
      Dr Sian Lindsay, Lecturer in Learning Development, Learning Development Centre, City University London
      • Encouraging our students to reflect on and learn from their feedback
      • Potentially improving student retention
      • Engaging more students with the personal tutorial system at City
      • Encouraging students to take a feedforward approach to their work
      • We took an ‘integrative’ evaluation approach (Cook, 2002), whereby we looked at how e-Reflect could best be integrated with the other resources available to our users. In our case, the resource was our new virtual learning environment, Moodle.
      • Students – how easy is e-Reflect to use? How enjoyable is it to use? Does e-Reflect feel personalised to me and meet my individual needs? Will e-Reflect help me to perform better in my studies?
      • Personal tutors (teaching staff) – how easy is e-Reflect to use? How much additional work will I have to do, over what I am already doing with my personal tutorials, to use it effectively? Does e-Reflect look professional, and does it integrate with the course content available on Moodle?
      • Support staff – what are the hardware implications of e-Reflect? Who hosts it? What are the staff development and training implications of e-Reflect?
      • Developers – can e-Reflect be customised? Is it open source, and can we adapt it to better suit our needs? How do we report bugs in the code?
      • Managers – how does e-Reflect fit within the institutional strategy? Can it enhance the status and attractiveness of the institution? What are the costs of running e-Reflect?
    • Strengths, Weaknesses, Opportunities, Threats (e-Reflect at City)
      • Strengths:
        • Easy, user-friendly interface
        • Three years of experience and results at Westminster
      • Weaknesses:
        • New system – training needs?
        • Doesn’t utilise single sign-on
        • Inflexible – staff cannot amend it to fit with current feedback/tutorial processes
        • Plain-text editor only, not suitable for maths/formula-based feedback
        • No clear linkage of diagnostic questionnaires to broader module content – student error easy
      • Opportunities:
        • Open source – possible future integration
      • Threats:
        • Reliant on Westminster IT infrastructure for support
        • No central IT support at City – not scalable for the future?
        • No integration means tutors don’t get a holistic view of students’ work
        • Difficulties engaging staff and students in new technology so soon after the Moodle launch?
    • Strengths, Weaknesses, Opportunities, Threats (the MAC model at City)
      • Strengths:
        • Successfully piloted for three years at Westminster
        • 100% online electronic submission already in place, with models for online feedback via Moodle already implemented
        • No resistance to Moodle: staff are highly engaged with the Moodle project, are actively using it to embed T&L activities, and all staff have had recent training
        • Students are already familiar with the Quiz and assignment submission features of Moodle
      • Weaknesses:
        • Dependent on staff and students being comfortable with an electronic interface
        • The model has not been piloted with different disciplines; no evidence it will work with Social Sciences
      • Opportunities:
        • The model could integrate well with our current tutorial and feedback processes
        • The model could improve student retention and enhance our current feedback and tutorial models
        • If we use Moodle, all students automatically receive an induction, and support/help for students is already in place
        • Staff are currently engaged with a blended learning model, due to the Moodle roll-out
      • Threats:
        • Staff and students would need to realise the benefits in order to actively engage with the model
      • Small-scale pilot within the Politics Department in September 2011.
      • Staff in the Politics Department have amended the SOS model to fit with their current personal tutoring system.
      • If successful, possible roll-out of the model to other departments from September 2012.
    • Conclusions and Future Work
      • The MAC process can benefit students
      • It brings them into greater and better ‘contact’ with their feedback and tutors
      • Variations of the MAC process can emerge to suit departmental/institutional contexts
      • The MAC process is not limited to being used with marked coursework
      • Generally, institutions would prefer to see the MAC process happen within their VLE
    • Acknowledgements and Thanks
      • To all the staff and students who participated across all institutions
      • To JISC for supporting the work
      • To you all for listening.
      Making Assessment Count