Good pedagogical practice driving learning analytics: OpenMentor, Open Comment and SAFeSEA

OpenMentor and Open Comment are two systems that provide immediate feedback to tutors and students about their learning and their support of learning. Both systems were built on the formalisation of sound pedagogical practice. SAFeSEA is a new EPSRC-funded project with Oxford University investigating summarisation techniques to give feedback on draft essays.

Speaker notes
  • We needed a tried and tested model for categorising the comments in order to code them, so that we could quantify the types of comments being provided. We chose Bales' Interaction Process Categories as a means of doing this. We then counted the number of incidences of each type of comment within the four main categories. The 12 categories are subcategories of four main types of response, as can be seen. This type of analysis could form the basis of an analysis by computer. By coding each type of response we were also able to go back and identify typical terms or phrases for each of the 12 individual categories. Examples of these were... [next slide]
  • When it came to identifying trends, this graph compares the distribution of the four main categories of comments across the four levels of pass. There is, more or less, a pattern for each standard of pass with regard to the types of comments given by tutors. [talk through graph] (The comments had been coded by two people with an inter-rater reliability of approximately 89%.) Category A shows praise and agreement, B shows direction and evaluation, C shows questions, and D shows disagreement. So looking at the four levels of pass... [talk through graph] The main objective of this phase of the analysis was to identify a set of trends in the tutor interactions that matched the grade awarded; a minimal sketch of this kind of counting follows these notes.
  • TODO: finding the right LOGO for Primma
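The counting described in these notes is straightforward to mechanise. Below is a minimal sketch, assuming comments have already been hand-coded with one of the twelve Bales labels and tagged with the script's pass level; the data and helper names are illustrative and not part of OpenMentor.

```python
from collections import Counter

# Illustrative hand-coded data: (pass level, Bales code) per tutor comment.
# Codes A1-A3, B1-B3, C1-C3, D1-D3 follow Bales' Interaction Process Categories.
coded_comments = [
    ("Pass 1", "A3"), ("Pass 1", "B2"), ("Pass 2", "A1"),
    ("Pass 2", "C1"), ("Pass 3", "B1"), ("Pass 4", "D1"),
]

def conflate(code: str) -> str:
    """Collapse the twelve sub-categories into the four main groups A-D."""
    return code[0]

# Count incidences of each main category at each pass level,
# mirroring the trend analysis described in the notes above.
counts = Counter((pass_level, conflate(code)) for pass_level, code in coded_comments)

for (pass_level, category), n in sorted(counts.items()):
    print(f"{pass_level}  category {category}: {n}")
```

Summing the conflated A-D counts per pass level gives the kind of distribution plotted on the "Identifying trends" slide below.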
  • Transcript

    • 1. Good Pedagogical Practice Driving Learning Analytics: OpenMentor, Open Comment and SAFeSEA Denise Whitelock denise.whitelock@open.ac.uk
    • 2. OM, OC and SAFeSEA born from pedagogical models • Building the feedback models in both systems • Finding a good classification system • Including socio-emotive support in feedback • OM advises how to do it • OC tries to put that advice into practice • SAFeSEA builds on OM and OC STORM October 2012 DMW
    • 3. What is Open Mentor? http://openmentor.kcl.ac.uk/openmentor-web/ • "An open source mentoring tool for tutors" • "Open source" = free and easy to use, and to embed in an institution's infrastructure and working practices • "mentoring" = designed to help people learn how to give feedback effectively, through reflection and social networks • "tutors" = primarily intended for teaching staff, but with clear applications for those involved in quality STORM October 2012 DMW
    • 4. Bales' Categories • Four main groupings • A. Positive reactions: agreeing and boosting the other person • B. Directing/teaching • C. Questions: requesting information, clarification etc. • D. Negative reactions: disagreement STORM October 2012 DMW
    • 5. Coding the comments: Bales' Interaction Process Categories, with specific examples
      Positive Reactions: A1 Shows solidarity (jokes, gives help, rewards others); A2 Shows tension release (laughs, shows satisfaction); A3 Shows agreement (understands, concurs, complies, passively accepts)
      Attempted Answers: B1 Gives suggestion (directs, proposes, controls); B2 Gives opinion (evaluates, analyses, expresses feelings or wishes); B3 Gives information (orients, repeats, clarifies, confirms)
      Questions: C1 Asks for information (requests orientation, repetition, confirmation, clarification); C2 Asks for opinion (requests evaluation, analysis, expression of feelings or wishes); C3 Asks for suggestion (requests directions, proposals)
      Negative Reactions: D1 Shows disagreement (passively rejects, resorts to formality, withholds help); D2 Shows tension (asks for help, withdraws); D3 Shows antagonism (deflates others, defends or asserts self)
      STORM October 2012 DMW
    • 6. Identifying trends: H801 [Graph showing conflated Bales' categories (A-D) at each pass level (Pass 1-4) against mean number of incidences in H801 scripts] STORM October 2012 DMW
    • 7. How Open Mentor handles comments • "Good work": A = positive reactions • "Yes, well done": A = positive reactions • "Yes, but is this useful?": B = attempted answers, and not a positive reaction • "Can you explain what you mean?": C = questions • "This does not follow": D = negative reactions STORM October 2012 DMW
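The mappings on slide 7 can be illustrated with a small keyword-based classifier. This is only a sketch of the idea; OpenMentor's actual rule set is built from coded training data, and the patterns below are invented for the example.

```python
import re

# Illustrative keyword rules for the four main Bales groups (A-D).
# Rule order is a crude way to resolve overlaps: "Yes, but is this useful?"
# should fall into B (attempted answers) rather than A or C.
RULES = [
    ("D", re.compile(r"does not follow|disagree|incorrect", re.I)),  # negative reactions
    ("B", re.compile(r"\bbut\b|consider|you should", re.I)),         # attempted answers
    ("C", re.compile(r"\?\s*$|can you|could you", re.I)),            # questions
    ("A", re.compile(r"good|well done|excellent|agree", re.I)),      # positive reactions
]

def classify(comment: str) -> str:
    """Return the first matching Bales group, or 'unclassified'."""
    for label, pattern in RULES:
        if pattern.search(comment):
            return label
    return "unclassified"

for c in ["Good work", "Yes, well done", "Yes, but is this useful?",
          "Can you explain what you mean?", "This does not follow"]:
    print(f"{c!r} -> {classify(c)}")
```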
    • 8. Building the rule set [Chart: percentage of comments classified from training data at each training iteration, 1-8] STORM October 2012 DMW
    • 9. Is the rule set generic? [Chart: percentage of comments classified from test data for the test sets B820 Test Set 1, B820 Test Set 2, B820 Test Set 3, A850, M878 and S809] STORM October 2012 DMW
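The charts on slides 8 and 9 track one number: the share of comments the current rule set can place in some category. A minimal sketch of that coverage measure, with a toy stand-in classifier and invented comments, might look like this.

```python
def coverage(comments, classify_fn) -> float:
    """Fraction of comments that classify_fn assigns to some Bales group."""
    hits = sum(1 for c in comments if classify_fn(c) != "unclassified")
    return hits / len(comments) if comments else 0.0

# Stand-in classifier: anything containing a cue word counts as classified.
def toy_classify(comment: str) -> str:
    return "A" if "good" in comment.lower() else "unclassified"

# Hypothetical held-out comments standing in for a test set such as B820.
test_set = ["Good point", "Why did you choose this source?", "This paragraph is unclear"]
print(f"{coverage(test_set, toy_classify) * 100:.0f}% of comments classified")
```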
    • 10. STORM October 2012 DMW
    • 11. OpenMentor Transfer: JISC funded • JISC funded project • Transfer OpenMentor technology to King’s and Southampton • What changes are needed for cross institutional use? • Identify strengths and limitations of OM for training tutors STORM October 2012 DMW
    • 12. Transferring OM to other HEIs • Transferred to Southampton and King's College London • Participating tutors were given face-to-face training • King's College: • 3 tutors • 25 learning experts at a TEL forum gave feedback after a demonstration • Southampton: • 3 tutors • Interviews and questionnaire • Open University: • 3 distance education tutors • Questionnaire and epistolary interviews • 113 students in a Masters course focussing on Innovation in eLearning, and 5 tutors STORM October 2012 DMW
    • 13. Lessons learned after completion of first cycle of trials • Open Mentor's theoretical framework was robust enough to facilitate and encourage dialogue and reflective activities • Tutors were positive about the system's functions to support provision of feedback • Suggestions for change: • a module for user authentication and management • the development of OM reports to help tutors to progress towards the ideal 'state' of feedback provided • use for training purposes as an academic development tool • Our contact details, blog and references: http://omtetra.ecs.soton.ac.uk/wordpress/ STORM October 2012 DMW
    • 14. What can we learn from modelling tutors' marking to construct a formative e-assessment tool? • The Open Comment project builds on the work of OpenMentor • Free text entry for History and Philosophy students • Immediate feedback (in context) to students • Influenced by ELIZA (Weizenbaum, 1966) STORM October 2012 DMW
    • 15. Causal models of explanation• First step: • Identification of question types where students exhibit causal reasoning• Looked for questions with: • Salient variables • Description of these variables • Identification of trends • Identification of relationship between the variables i.e. causality STORM October 2012 DMW
    • 16. Three common question types were selected for Open Comment in History: 1. Analysis of statistics, usually presented to the student as a table 2. Comprehension of a set text 3. Identifying similarities and differences for a given event STORM October 2012 DMW
    • 17. The tool: an example question STORM October 2012 DMW
    • 18. STORM October 2012 DMW
    • 19. STORM October 2012 DMW
    • 20. STORM October 2012 DMW
    • 21. Stages of analysis by computer of students' free text entry for Open Comment: advice with respect to content (socio-emotional support, stylised example) • STAGE 1a: DETECT ERRORS, e.g. incorrect dates or facts (incorrect inferences and causality are dealt with below) • "Instead of concentrating on X, think about Y in order to answer this question" • Recognise effort (Dweck) and encourage another go • "You have done well to start answering this question but perhaps you misunderstood it. Instead of thinking about X, which did not…….. consider Y" STORM October 2012 DMW
    • 22. Computer analysis continued • STAGE 2a: REVEAL FIRST OMISSION • "Consider the role of Z in your answer" • Praise what is correct and point out what is missing: "Good, but now consider the role X plays in your answer" • STAGE 2b: REVEAL SECOND OMISSION • "Consider the role of P in your answer" • Praise what is correct and point out what is missing: "Yes, but also consider P. Would it have produced the same result if P is neglected?" STORM October 2012 DMW
    • 23. Final stages of analysis • STAGE 3: REQUEST CLARIFICATION OF KEY POINT 1 • STAGE 4: REQUEST FURTHER ANALYSIS OF KEY POINT 1 (Stages 3 and 4 repeated with all the key points) • STAGE 5: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS MISSING • STAGE 6: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS NOT COMPLETE • STAGE 7: CHECK THE CAUSALITY • STAGE 8: REQUEST THAT ALL THE CAUSAL FACTORS ARE WEIGHTED STORM October 2012 DMW
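Slides 21-23 describe a fixed ordering of checks, each paired with a stylised feedback template. The sketch below shows only that sequencing logic, with hypothetical placeholder checks; Open Comment's actual free-text analysis is not reproduced here.

```python
from typing import Callable, Optional

# Each stage pairs a hypothetical check on the student's answer with a stylised
# feedback template of the kind shown on the preceding slides. The checks are
# placeholders, not Open Comment's real analysis.
Stage = tuple[str, Callable[[str], bool], str]

STAGES: list[Stage] = [
    ("1a detect errors", lambda a: "1875" in a,
     "You have done well to start answering this question, but check your dates before going on."),
    ("2a reveal first omission", lambda a: "trade" not in a.lower(),
     "Good. Now consider the role trade plays in your answer."),
    ("3 request clarification", lambda a: len(a.split()) < 40,
     "Can you say a little more about your key point?"),
]

def next_feedback(answer: str) -> Optional[str]:
    """Return feedback from the first stage whose condition fires, following the
    stage-by-stage order above; None means no advice on this pass."""
    for name, triggered, template in STAGES:
        if triggered(answer):
            return f"[{name}] {template}"
    return None

print(next_feedback("The famine of 1875 reduced the rural population."))
```

The point of the ordering is that socio-emotive encouragement and content advice are issued one step at a time, rather than all at once.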
    • 24. SAFeSEA: Supportive Automated Feedback for Short Essay Answers. An automated tool supporting online writing and assessment of essays, providing accurate targeted feedback. Dr Denise Whitelock, Professor John Richardson, Professor Stephen Pulman STORM October 2012 DMW
    • 25. About SAFeSEA • Effect of summarisation • What are the beneficial factors? • Correlate measures of learner activity and essay improvement • Effect of hints • http://www8.open.ac.uk/iet/main/research-scholarship/research-projects/supportive-automated-feedback-short-essay-answers STORM October 2012 DMW
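SAFeSEA's starting point is that a summary of a draft essay can act as feedback: the student compares what the essay actually says with what they intended to say. As a hedged illustration only (not SAFeSEA's actual technique, which the project is investigating), here is a toy frequency-based extractive summariser.

```python
import re
from collections import Counter

def extractive_summary(essay: str, n_sentences: int = 2) -> list:
    """Score each sentence by the frequency of its words in the whole essay and
    keep the top n, returned in their original order. A toy stand-in for the
    summarisation techniques SAFeSEA studies."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", essay) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", essay.lower()))
    scored = sorted(sentences,
                    key=lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())),
                    reverse=True)
    top = set(scored[:n_sentences])
    return [s for s in sentences if s in top]

draft = ("Feedback helps students revise. Summaries of a draft can show a student "
         "what their essay actually says. A student can then compare the summary "
         "with what they intended to say and revise the draft.")
print(extractive_summary(draft))
```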
    • 26. SAFeSEA• Support for essay writing• Shape landscape of eLearning and Learning Analytics• Improves the student experience• Support advances in NLP STORM October 2012 DMW
    • 27. Final thoughts • There is a growing consensus in the field of assessment that times are changing and that assessment needs to become more embedded and central in the teaching and learning cycle (Hatzipanagos & Rochon, 2011). • Our project provides another phase in this type of research, where the balance of socio-emotive content contained in feedback cannot be ignored (Draper, 2009). • Feedback that encourages the student to actively change their ideas and ways of organising their answers and discourse within a given subject domain is what is required, and is advocated by Whitelock (2011) as "advice for action". STORM October 2012 DMW
    • 28. “Advice for Action”, Whitelock (2011)• Helping students find out what they do not know and how to remedy the situation can avoid the trauma of assessment• Are we on the way with new e-tools? STORM October 2012 DMW
    • 29. References • Whitelock, D., Gilbert, L., Hatzipanagos, S., Watt, S., Zhang, P., Gillary, P. & Saucedo, A. (2012) Supporting tutors with their feedback using OpenMentor in three different UK universities. 10th International Conference on Computer Based Learning in Science, CBLIS 2012, Barcelona, Spain, 26-29 June 2012. • Whitelock, D., Gilbert, L. & Gale, V. (2011) 'Technology-Enhanced Assessment and Feedback: How is evidence-based literature informing practice?' International Computer Assisted Assessment Conference, De Vere Grand Harbour Hotel, Southampton, 5/6 July 2011. http://caaconference.co.uk/wp-content/uploads/WhitelockB-CAA2011.pdf • Whitelock, D. (2010) Activating Assessment for Learning: are we on the way with Web 2.0? In M.J.W. Lee & C. McLoughlin (Eds.) Web 2.0-Based E-Learning: Applying Social Informatics for Tertiary Teaching. IGI Global, pp. 319-342. STORM October 2012 DMW
    • 30. References (2) • Whitelock, D. & Watt, S. (2008) 'Putting Pedagogy in the driving seat with Open Comment: an open source formative assessment feedback and guidance tool for History students.' CAA Conference 2008, Loughborough University, 8/9 July 2008, edited by Farzana Khandia, pp. 347-356. ISBN 0-9539572-7-6. http://kn.open.ac.uk/public/document.cfm?docid=11638 • Whitelock, D. & Watt, S. (2007) e-Assessment: How can we support tutors with their marking of electronically submitted assignments? Ad-Lib Journal for Continuing Liberal Adult Education, Issue 32, March 2007, pp. 7-9. ISSN 1361-6323. STORM October 2012 DMW
    • 31. References (3) • Whitelock, D. (2006) Electronic Assessment: Marking, Monitoring and Mediating Learning. In McAndrew, P. and Jones, A. (Eds) Interactions, Objects and Outcomes in Learning. Special issue of International Journal of Learning Technology, Vol. 2, Nos 2/3, pp. 264-276. • Whitelock, D. & Watt, S. (2006) OpenMentor: opening tutors' eyes to the written support given to students in their assignments. JISC Conference 2006, Information & Communication Technology in Education and Research, International Conference Centre, Birmingham, 14 March 2006. STORM October 2012 DMW