Assessment and Feedback programme update (April 2012)

Update on the Assessment and Feedback programme, given at the JISC Learning and Teaching Experts meeting, April 2012.



  1. Assessment and Feedback programme, 24th April 2012. #jiscassess www.jisc.ac.uk/assessmentandfeedback
  2. Overview of programme, strands and deliverables. www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
  3. Programme overview
     – Strand A: 8 projects, 6 months to 2 years, 2011-2013
     – Strand B: 8 projects, 9 months to 2 years, 2011-2013
     – Strand C: 4 projects, 3 years, 2011-2014
     – Support and Synthesis Project
  4. Locations
  5. Programme level outcomes: increased usage of appropriate technology-enhanced assessment and feedback, leading to:
     – Change in the nature of assessment
     – Efficiencies, and improvement of assessment quality
     – Enhancement of the student and staff experience
     Also: clearly articulated business cases; models of sustainable institutional support, and guidance on costs and benefits; evidence of impact on staff and students, workload and satisfaction.
  6. Strand A goals and objectives: improved student learning and progression; enhanced learning and teaching practice; integrated strategies, policies & processes; increased efficiency. (Overarching goals from Strand A projects, synthesised from their bid documents.)
  7. Deliverables
     – Strand A: baseline report; descriptions of previous work in the area; evaluation report; range of assets – evidence of impact; guidance and support materials
     – Strand B: description of user scenarios; evaluation report; range of assets – evidence of impact; short briefing paper summarising the innovation and benefits
     – Strand C: summary of the technical model; open source widgets and code; developer guidelines; documentation for users; active community of users; short summary of the innovation
  8. Technologies
  9. Themes and challenges
  10. Programme and support team. www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
  11. Programme Support Team: programme co-ordinator, support team, critical friends, evaluation, synthesis and support.
  12. What are we learning about technology-enhanced assessment and feedback practices? www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
  13. Why baseline? Programme level:
      – View of landscape & direction of travel
      – Validate aims & rationale
      – Shared understanding
      – Identify synergies with other work
      – Deliver effective support
  14. Why baseline? Project level:
      – View of landscape & direction of travel
      – Validate scope
      – Confirm/identify challenges
      – Identify stakeholders
      – Manage & communicate scope
      – Challenge myths
      – Identify readiness for change
      – Show evidence of improvement
      An important stage of engagement/ownership.
  15. Sources of baseline evidence
      – Structured and semi-structured interviews (some video)
      – Workshops and focus groups
      – Process maps
      – Rich pictures
      – Institutional (and devolved) strategy & policy documents
      – Questionnaires
      – Institutional QA documentation
      – Reports by QAA, OFSTED & external examiners
      – Course evaluations
      – Student surveys
      – Quantitative analysis of key data sets
      – Data from research projects
  16. Differences in emphasis
  17. Differences in emphasis
  18. Are our projects typical of the landscape?
  19. Issues: strategy / policy / principles
      – Formal strategy/policy documents lag behind current thinking
      – Educational principles are rarely enshrined in strategy/policy
      – Devolved responsibility makes it difficult to achieve parity of learner experience
  20. Issues: stakeholder engagement
      – Learners are not often actively engaged in developing practice
      – Assessment and feedback practice does not reflect the reality of working life
      – Administrative staff are often left out of the dialogue
  21. Findings: assessment and feedback practice
      – Traditional forms such as essays/exams still predominate
      – Timeliness of feedback is an issue
      – Curriculum design issues inhibit longitudinal development
  22. Key resources. www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
  23. http://www.jisc.ac.uk/assessment
  24. http://www.netvibes.com/jiscinfonet#%23jiscassess
  25. http://jiscdesignstudio.pbworks.com
  26. Assessment & Feedback hub pages (Transforming Assessment & Feedback): peer assessment & review; assessment management; effectiveness & efficiency in assessment; employability & assessment; authentic assessment; work-based learning & assessment; longitudinal & ipsative assessment; assessment for learning; feedback & feed forward. http://tinyurl.com/jiscafds
  27. Activity: decide if you agree or disagree with each of the statements made on the previous slides (as being representative of mainstream practice in the sector).
      – If you agree, state examples of what can be done about it
      – If you disagree, state examples of evidence to the contrary
  28. © HEFCE 2012. The Higher Education Funding Council for England, on behalf of JISC, permits reuse of this presentation and its contents under the terms of the Creative Commons Attribution-Non-Commercial-No Derivative Works 2.0 UK: England & Wales Licence. http://creativecommons.org/licenses/by-nc-nd/2.0/uk
  29. Evidence and evaluation projects – Strand B
      – EBEAM – University of Huddersfield
      – OCME – University of Exeter
      – EEVS – University of Hertfordshire
      – MACE – University of Westminster
      – EFFECT – University of Dundee
      – SGC4L – University of Edinburgh
      – The evaluation of Assessment Diaries and GradeMark – University of Glamorgan
  30. Timings
      – 11.15–11.35: Participants move round all 3 rooms to look at the 7 posters and have short introductory discussions with projects; identify 3 projects you’d like to know more about
      – 11.35–11.50: Discussion with Project 1
      – 11.50–12.05: Discussion with Project 2
      – 12.05–12.20: Discussion with Project 3
  31. Rooms
      – Proceed: Student-Generated Content for Learning: Enhancing Engagement, Feedback and Performance (SGC4L project), Judy Hardy, University of Edinburgh; Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS project), Amanda Jefferies, University of Hertfordshire
      – Propel 1: Making Assessment Count Evaluation (MACE project), Gunter Saunders and Peter Chatterton, University of Westminster, Mark Kerrigan, University of Greenwich, and Loretta Newman-Ford, Cardiff Metropolitan University; Evaluating feedback for e-learning: centralized tutors (EFFECT project), Aileen McGuigan, University of Dundee
      – Propel 2: Evaluating the Benefits of Electronic Assessment Management (EBEAM project), Cath Ellis, University of Huddersfield; Online Coursework Management Evaluation (OCME project), Anka Djordjevic, University of Exeter; The Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan, Karen Fitzgibbon and Sue Stocking, University of Glamorgan
