Automatic feedback for motivation and self-regulation
Open Seminar by Prof. Denise Whitelock
Open University Institute of Educational Technology
10 May 2013

Published in: Education, Technology
Slide notes
  • We needed a tried and tested model for categorising the comments in order to code them, so that we could quantify the types of comment being provided. We chose Bales' Interactional Categories as a means of doing this, and then counted the number of incidences of each type of comment within the four main categories. The 12 categories are subcategories of four main types of response, as can be seen. This type of analysis could form the basis of an analysis by computer. Also, by coding each type of response we were then able to go back and identify typical terms or phrases for each of the 12 individual categories. Examples of these were……. [next slide]
  • When it came to identifying trends, this graph compares the distribution of the four main categories of comments within the four levels of pass. There is, more or less, a pattern for each standard of pass with regard to the types of comments given by tutors. [talk through graph] (The comments had been coded by two people with an inter-rater reliability of approximately 89%.) Category A shows 'praise and agreement', B shows 'direction and evaluation', C shows questions and D shows 'disagreement'. So looking at the four levels of pass… [talk through graph] The main objective of this phase of the analysis was to identify a set of trends in the tutor interactions that matched the grade awarded.
  • The results can be seen here once again. This chart illustrates the distribution of each category of comment across all levels of pass. However, this doesn't enable us to identify trends; it gets interesting when we look at the breakdown once again… [next slide]
  • We plotted the following charts, and a shift in the distribution of comments among the four interactional categories can be seen. This suggests (as shown in the previous bar chart) that the number of comments that are questions increases as the student's score decreases, and vice versa.
  • TODO: finding the right LOGO for Primma
  • Transcript

    • 1. Automatic feedback for motivation and self-regulation. Professor Denise Whitelock, The Open University, Walton Hall, Milton Keynes MK7 6AA, UK. denise.whitelock@open.ac.uk
    • 2. DMW UOC Open Seminar May 2013
    • 3. [image slide]
    • 4. I hate marking but want the tasks and feedback to assist student learning.
    • 5. The e-Assessment and automatic feedback challenge: constructivist learning (push) versus institutional reliability and accountability (pull).
    • 6. www.storiesabout.com and www.storiesabout.com/creativepdp; c.mckillop@rgu.ac.uk
    • 7. [image slide]
    • 8. MCQs: variation on a theme. Example of LAPT Certainty-Based Marking, UK cabinet ministers demo exercise showing feedback, University College, Tony Gardner-Medwin. Drug chart errors and omissions, Medicines Administration Assessment, Chesterfield Royal Hospital.
    • 9. MCQs: high-stakes assessment. Example of a practice Thinking Skills Assessment (TSA) question and its feedback, Admissions Interview, Cambridge Assessment, Steve Lay.
    • 10. Scaffolding and high-stakes assessment: Maths for Science; tutor-less course; competency led; no point in cheating; web home exam; invigilation technologies.
    • 11. Self-diagnosis: basic IT skills for first-year medical students (Sieber, 2009); competency-based testing; repeating tests for revision; enables remedial intervention.
    • 12. Elliott's characteristics of Assessment 2.0 activities: Authentic (involving real-world knowledge and skills); Personalised (tailored to the knowledge, skills and interests of each student); Negotiated (agreed between the learner and the teacher); Engaging (involving the personal interests of the students); Recognises existing skills (willing to accredit the student's existing work); Deep (assessing deep knowledge, not memorisation); Problem oriented (original tasks requiring genuine problem-solving skills); Collaboratively produced (produced in partnership with fellow students); Peer and self assessed (involving self reflection and peer review); Tool supported (encouraging the use of ICT).
    • 13. Authentic assessment: building e-portfolios on a chef's course. Food preparation, and evidence of food preparation skill, for an e-portfolio; Modern Apprenticeship in Hospitality and Catering, West Suffolk College, Mike Mulvihill.
    • 14. Peer assessment and the WebPA tool, Loughborough (Loddington et al., 2009). Self-assess and peer-assess against given criteria; group mark awarded by tutor. Students rated: more timely feedback, reflection, fair rewards for hard work. Staff rated: time savings, administrative gains, automatic calculation, and students' faith in the administrative system.
    • 15. Mobile technologies and assessment: MCQs on PDAs (Valdivia & Nussbaum, 2009); polls and instant surveys (Simpson & Oliver, 2007); Draper (2009) on EVS.
    • 16. Gains from formative assessment: mean effect size on standardised tests between 0.4 and 0.7 (Black & Wiliam, 1998); particularly effective for students who have not done well at school (http://kn.open.ac.uk/document.cfm?docid=10817); can keep students to timescale and motivate them. How can we support our students to become more reflective learners and engage in formative assessment tasks?
    • 17. Collaborative formative assessment with Global Warming. DMW, Institute of Educational Technology, September 1997.
    • 18. Global Warming.
    • 19. Next: 'yoked' apps via BuddySpace. Student A and Student B ('yoked', but without full screen sharing required!).
    • 20. Global Warming: Simlink presentation.
    • 21. LISC: Aily Fowler. Kent University ab-initio Spanish module: large student numbers; skills-based course; providing sufficient formative assessment meant unmanageable marking loads; impossible to provide immediate feedback, leading to fossilisation of errors.
    • 22. The LISC solution, developed by Ali Fowler: a CALL system designed to enable students to independently practise sentence translation, receive immediate (and robust) feedback on all errors, and attend immediately to the feedback (before fossilisation can occur).
    • 23. How is the final mark arrived at in the LISC system? The two submissions are unequally weighted. It is best to give more weight to the first attempt, since this ensures that students give careful consideration to the construction of their first answer, while still letting them improve their mark by refining it. The marks ratio can vary (depending on assessment/feedback type): the more information given in the feedback, the lower the weight the second mark should carry.
    • 24. Heuristics for the final mark. If the ratio is skewed too far in favour of the first attempt, students are less inclined to try hard to correct non-perfect answers. If it is skewed too far in favour of the second attempt, students exhibit less care over the construction of their initial answer.
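The two-attempt weighting described on these slides can be sketched as a small function. This is a minimal illustration, assuming a 70/30 split; the slides do not give LISC's actual ratio, and the function name is ours.

```python
def final_mark(first, second, first_weight=0.7):
    """Combine the marks for two attempts at the same task.

    first, second: marks (e.g. on a 0-100 scale) for the first and
    second submissions. first_weight: the share given to the first
    attempt. The 0.7 default is illustrative only; LISC varies the
    ratio with the amount of information given in the feedback.
    """
    if not 0.0 <= first_weight <= 1.0:
        raise ValueError("first_weight must lie between 0 and 1")
    return first_weight * first + (1.0 - first_weight) * second
```

Raising `first_weight` rewards care over the initial answer; lowering it rewards acting on the feedback, which is exactly the trade-off the heuristics on slide 24 describe.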
    • 25. What about emotional support in the feedback? It can be difficult at times to receive written feedback; the response is not just cognitive. How can Bales help?
    • 26. Coding into categories: Bales analysis (psychology, 1950s) analyses talk and includes socio-emotive categories. Flanders' (1970) categories were inappropriate here, as they also cover classroom control.
    • 27. Bales categories, four main groupings: A. Positive reactions (agreeing with and boosting the other person); B. Directing/teaching; C. Questions (requesting information, clarification, etc.); D. Negative reactions (disagreement).
    • 28. Coding the tutor comments (Bales' Interaction Process). Positive Reactions: A1 shows solidarity (jokes, gives help, rewards others); A2 shows tension release (laughs, shows satisfaction); A3 shows agreement (understands, concurs, complies, passively accepts). Attempted Answers: B1 gives suggestion (directs, proposes, controls); B2 gives opinion (evaluates, analyses, expresses feelings or wishes); B3 gives information (orients, repeats, clarifies, confirms). Questions: C1 asks for information (requests orientation, repetition, confirmation, clarification); C2 asks for opinion (requests evaluation, analysis, expression of feeling or wishes); C3 asks for suggestion (requests directions, proposals). Negative Reactions: D1 shows disagreement (passively rejects, resorts to formality, withholds help); D2 shows tension (asks for help, withdraws); D3 shows antagonism (deflates others, defends or asserts self).
    • 29. Identifying trends: H801. [Bar chart: conflated Bales categories (A-D) at each pass level (Pass 1-4) against the mean number of incidences in H801 scripts, on a 0-25 scale.]
    • 30. Identifying trends: H801. [Pie chart: mean number of incidences per pass per conflated Bales category for all four levels of pass in H801 scripts. A = positive reactions, 5.96; B = responses, 17.13; C = questions, 5.73; D = negative reactions, 1.61.]
    • 31. Identifying trends. [Pie charts: mean number of incidences per conflated Bales category for 'Pass 1' and 'Pass 4' in B820, S103 and H801. Key: A = positive reactions, B = responses, C = questions, D = negative reactions.]
    • 32. What is Open Mentor? 'An open source mentoring tool for tutors.' 'Open source' means free and easy to use, and to embed in an institution's infrastructure and working practices. 'Mentoring' means designed to help people learn how to give feedback effectively, through reflection and social networks. 'Tutors' means primarily intended for teaching staff, but with clear applications for those involved in quality.
    • 33. How Open Mentor handles comments: 'Good work' and 'Yes, well done' are A (positive reactions); 'Yes, but is this useful?' is B (an attempted answer, not a positive reaction); 'Can you explain what you mean?' is C (a question); 'This does not follow' is D (a negative reaction).
    • 34. Explaining Open Mentor's rules: four categories. A = Positive Reactions; B = Attempted Answers; C = Questions; D = Negative Reactions.
    • 35. 'A' (Positive Reactions) rules, with example comments. Shows solidarity: A1 ...excellent... ('Excellent conclusions.'); A1 ...(good|comprehensive)... ('Good, you are drawing on hard facts here.'); A1 ...nicely... ('Very nicely stated. Your analysis is thorough and your conclusions consistent regarding the attractiveness of the budget airline sector. This is a good example of critical thinking.'); A1 ...well presented... ('Very well presented diagram with interesting information.'); A1 ...effective use... ('Effective use of the case material here.'); A1 ...well (structured|stated)... ('Report very well structured.'); A1 ...(well|clear)(ly)*(structured|structure|summary|summarised|presented|presentation)... ('The corporate vs. business unit strategy is well presented and nicely tied to strategies.'); A1 ...reasonable... ('A reasonable structure as listed in your table of contents.'); A1 ...useful point(s)... ('Generally useful points in this section.'). Shows tension release: A2 ...a helpful...; A2 ...(thanks|thank you).... Shows agreement: A3 ...yes... ('Yes, the intellectual reactions are both real.'); A3 ...indeed... ('Indeed, if it has one basic strategy it is surely differentiation, though it still has to control costs.'). Praise then direction: A4 good...but... ('Good model, good quote, but be careful about what industry you analyse.').
    • 36. 'B' (Attempted Answers) rules, with example comments. Gives suggestion: B4 ...perhaps... ('Perhaps even better here to explain the link in your mind between "analysis of strategies" and "strategic issues".'); B4 ...requires...; B4 ...take care... ('Take care with your STEP analysis not to make it too industry focussed.'); B4 ...useful to... ('Innovation is closely linked to structure and culture; it would be useful to see some Book 6-8 concepts here too.'); B4 ...you (might|ought)... ('You ought to have explicitly stated these.'); B4 don't|never... ('Don't introduce new frameworks just for the sake of it in the conclusion. The conclusion should be pulling together what went before.'); B4 please (see|refer to|look at)... ('Please make sure to read and understand the question correctly.'). Gives opinion: B5 I (am|think)... ('I think I can see where you are going, though a numbered report format might have demonstrated the approach better.'); B5 This is... ('This is an introduction rather than a "summary".'); B5 ...sounds...like... ('This sounds as if it could be very popular!'); B5 ...not sure... ('I am not sure about the balance between the environmental analysis and the review of the resources and capabilities (power, culture, structures and systems) as raised in the question.'); B5 I (thought|agree|suggest)... ('I thought it was because they did not need any external input and saw a significant market sector they could address themselves.'); B5 I (do|don't) think... ('I don't think this exercise has helped to develop your analysis. I also think that the development of the perspectives is superficial.'). Gives information: B6 ...(demonstrates|shows) this...; B6 Also... ('Also, cross-link to leadership issues, Pettigrew on strategic thinking too.'); B6 ...Q1... etc. ('Q1 = 59/100').
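Rule tables like these lend themselves to simple pattern matching. The sketch below shows the general technique with a deliberately tiny rule set; the patterns, their ordering and the `classify` function are our simplifications for illustration, not Open Mentor's actual rule base.

```python
import re

# Tiny illustrative rule set: (category, pattern) pairs checked in
# order, so more specific patterns should come before general ones.
# Real rule bases, like the tables above, are far larger.
RULES = [
    ("B", re.compile(r"perhaps|take care|you (might|ought)", re.I)),
    ("C", re.compile(r"\?|can you explain", re.I)),
    ("D", re.compile(r"does not follow|disagree", re.I)),
    ("A", re.compile(r"excellent|well done|nicely|indeed|\byes\b", re.I)),
]

def classify(comment):
    """Return the first Bales-style category whose pattern matches."""
    for category, pattern in RULES:
        if pattern.search(comment):
            return category
    return None  # unclassified; useful when measuring rule coverage
```

Counting the categories returned over a batch of tutor comments gives exactly the per-category incidence totals used in the trend charts that follow.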
    • 37. Is the rule set generic? [Bar chart: percentage of comments classified, 0-100%, for test data sets B820 Test Set 1, B820 Test Set 2, B820 Test Set 3, A850, M878 and S809.]
    • 38. [image slide]
    • 39. [image slide]
    • 40. OpenMentor Transfer: a JISC-funded project to transfer OpenMentor technology to King's and Southampton. What changes are needed for cross-institutional use? Identify strengths and limitations of OM for training tutors.
    • 41. Transferring OM to other HEIs: transferred to Southampton and King's College London, with participating tutors given face-to-face training. King's College: 3 tutors; 25 learning experts at a TEL forum gave feedback after a demonstration. Southampton: 3 tutors; interviews and a questionnaire. Open University: 3 distance education tutors; questionnaire and epistolary interviews; 113 students on a Masters course focusing on innovation in eLearning, and 5 tutors.
    • 42. Lessons learned after completion of the first cycle of trials. Open Mentor's theoretical framework was robust enough to facilitate and encourage dialogue and reflective activities, and tutors were positive about the system's functions to support the provision of feedback. Suggestions for change: a module for user authentication and management; the development of OM reports to help tutors progress towards the ideal 'state' of feedback provision; use for training purposes as an academic development tool. Our contact details, blog and references: http://omtetra.ecs.soton.ac.uk/wordpress/
    • 43. What can we learn from modelling tutors' marking to construct a formative e-assessment tool? The Open Comment project builds on the work of OpenMentor: free-text entry for History and Philosophy students; immediate feedback (in context) to students; influenced by ELIZA (Weizenbaum, 1963).
    • 44. Open Comment addresses the problem of free-text entry: an automated formative assessment tool with free-text entry for students; automated feedback and guidance; open questions and divergent assessment; no marks awarded; for use by the Arts Faculty.
    • 45. Causal models of explanation. First step: identification of question types where students exhibit causal reasoning. We looked for questions with salient variables, descriptions of these variables, identification of trends, and identification of the relationships between the variables, i.e. causality.
    • 46. [image slide]
    • 47. [image slide]
    • 48. [image slide]
    • 49. Praise for effort and not just ability. Praise for ability per se can hinder learning (Mueller & Dweck, 1998): praise comes to mean being clever, so negative feedback is then read as a lack of ability, which is disempowering and demoralising.
    • 50. Mueller & Dweck (1998) used Raven's Matrices (IQ). On the first test, pupils were praised either for effort or for ability. The second test was the most difficult. The third test was of medium difficulty: scores went up 1 point for pupils praised for effort and down 1 point for those praised for ability.
    • 51. Stages of analysis by computer of students' free-text entry for Open Comment: advice with respect to content (socio-emotional support, stylised example). STAGE 1a: DETECT ERRORS, e.g. incorrect dates or facts (incorrect inferences and causality are dealt with below). 'Instead of concentrating on X, think about Y in order to answer this question.' Recognise effort (Dweck) and encourage another go: 'You have done well to start answering this question, but perhaps you misunderstood it. Instead of thinking about X, which did not…….., consider Y.'
    • 52. Computer analysis continued. STAGE 2a: REVEAL FIRST OMISSION. Consider the role of Z in the answer; praise what is correct and point out what is missing: 'Good, but now consider the role X plays in your answer.' STAGE 2b: REVEAL SECOND OMISSION. Consider the role of P in the answer; praise what is correct and point out what is missing: 'Yes, but also consider P. Would it have produced the same result if P is neglected?'
    • 53. Final stages of analysis. STAGE 3: REQUEST CLARIFICATION OF KEY POINT 1. STAGE 4: REQUEST FURTHER ANALYSIS OF KEY POINT 1 (stages 3 and 4 are repeated for all the key points). STAGE 5: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS MISSING. STAGE 6: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS NOT COMPLETE. STAGE 7: CHECK THE CAUSALITY. STAGE 8: REQUEST THAT ALL THE CAUSAL FACTORS ARE WEIGHTED.
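The staged analysis can be pictured as a pipeline of checks, each appending feedback in the praise-then-direct style of the examples above. This is a hypothetical sketch only: the stage functions, trigger strings and the `Answer` structure are all our invention for illustration, not Open Comment's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Answer:
    text: str
    feedback: list = field(default_factory=list)

def detect_errors(answer):
    # Stage 1a: flag an incorrect or missing fact (stubbed check).
    if "1066" not in answer.text:
        answer.feedback.append("You have done well to start; check your dates.")

def reveal_omissions(answer):
    # Stage 2: praise what is correct, then point out what is missing.
    if "economy" not in answer.text:
        answer.feedback.append("Good, but now consider the role the economy plays.")

def check_causality(answer):
    # Stage 7: ask for a causal link if none is stated.
    if "because" not in answer.text:
        answer.feedback.append("What caused this? Link your factors causally.")

STAGES = [detect_errors, reveal_omissions, check_causality]

def analyse(text):
    answer = Answer(text)
    for stage in STAGES:
        stage(answer)
    return answer.feedback
```

Ordering the stages in a list makes the later stages (clarification, inference, weighting of causal factors) easy to add as further functions.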
    • 54. Where are we now? Opening up with open source; moving towards the vision without losing sight of it through tool adaptation; more work to do for the Arts; Open Comment's pedagogical model is open to test; feedback; changing pedagogy; another handle on misconceptions.
    • 55. Open Comment drivers for reflection. Students: are able to find facts similar to X; know how X might be disputed; are able to make predictions about X; know how to use X in an argument; know how far X can be pushed; and are supported with tools and strategies for effort.
    • 56. SAFeSEA: Supportive Automated Feedback for Short Essay Answers. Professor Denise Whitelock, Professor John Richardson, Professor Stephen Pulman. An automated tool supporting online writing and assessment of essays, providing accurate, targeted feedback.
    • 57. About SAFeSEA: the effect of summarisation; what are the beneficial factors?; correlating measures of learner activity with essay improvement; the effect of hints. http://www.open.ac.uk/iet/main/research-
    • 58. OpenEssayist: the SAFeSEA web application for summarisation-based formative feedback.
    • 59. Key words and phrases visualized in the essay context. Sentences on a light-grey (green) background are key sentences as extracted by the EssayAnalyser (the number at the start of the sentence indicates its key-ness ranking); bigrams are shown in bold (red) and boxed.
    • 60. The structural elements of the essay can be used jointly with the key word extraction to highlight relevant information within specific parts of the essay, here the introduction (and the assignment question).
    • 61. Key words and phrases as separate lists.
    • 62. Dispersion of key words across the essay. http://www.open.ac.uk/iet/main/research-scholarship/research-projec
    • 63. Can we find ways of using graph visualization techniques on the key words and key sentences, to make them helpful and meaningful to students?
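The key-sentence idea behind these slides can be illustrated with a frequency-based extractive scorer: rank each sentence by how many frequent content words it contains. This is a toy sketch, not the EssayAnalyser, which uses more sophisticated graph-based ranking; the stopword list, splitting regex and the `key_sentences` function are our assumptions.

```python
import re
from collections import Counter

# Deliberately tiny stopword list for illustration.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "that", "it"}

def key_sentences(essay, top_n=2):
    """Return the top_n sentences ranked by summed content-word frequency."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", essay) if s.strip()]
    words = [w for w in re.findall(r"[a-z']+", essay.lower()) if w not in STOPWORDS]
    freq = Counter(words)

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens if t not in STOPWORDS)

    return sorted(sentences, key=score, reverse=True)[:top_n]
```

Sentences dense in the essay's own recurring vocabulary rise to the top, which is the intuition behind showing students their highest-ranked 'key' sentences as a summary of what their draft actually emphasises.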
    • 64. SAFeSEA: support for essay writing; shaping the landscape of eLearning and learning analytics; improving the student experience; supporting advances in NLP.
    • 65. Feedback: students must decode feedback and then act on it (Boud, 2000); students must have the opportunity to act on feedback (Sadler, 1989); gauging efficacy through action.
    • 66. Badge system: Mozilla.
    • 67. Elliott's characteristics of Assessment 2.0 activities (repeated from slide 12), overlaid with the words 'Advice for Action'.
    • 68. The 4Ts Pyramid: Tasks, Tool development, Training of staff and Transfer, leading to the transformation of learning.
    • 69. National Union of Students' principles of effective assessment (Times Higher Education, 29 January 2009). Assessment: should be for learning, not simply of learning; should be reliable, valid, fair and consistent; should consist of effective and constructive feedback; should be innovative and have the capacity to inspire and motivate; should be conducted throughout the course, rather than being positioned as a final event; and should develop key skills such as peer and reflective assessment.
    • 70. Final thoughts. There is a growing consensus in the field of assessment that times are changing and that assessment needs to become more embedded and central in the teaching-learning cycle (Hatzipanagos & Rochon, 2011). Our project provides another phase in this type of research, where the balance of socio-emotive content contained in feedback cannot be ignored (Draper, 2009). Feedback that encourages the student to actively change their ideas and their ways of organising their answers and discourse within a given subject domain is what is required, and is advocated by Whitelock (2011) as 'advice for action'.
    • 71. 'Advice for Action', Whitelock (2011). Helping students find out what they do not know, and how to remedy the situation, can avoid the trauma of assessment. Are we on the way with new e-tools?
    • 72. [image slide]
    • 73. References: Van Labeke, N., Whitelock, D., Field, D., Pulman, S. & Richardson, J. (2013) 'OpenEssayist: Extractive Summarisation & Formative Assessment of Free-Text Essays', Workshop on Discourse-Centric Learning Analytics, 3rd Conference on Learning Analytics and Knowledge (LAK 2013), Leuven, Belgium. Whitelock, D., Gilbert, L., Hatzipanagos, S., Watt, S., Zhang, P., Gillary, P. & Saucedo, A. (2012) 'Supporting tutors with their feedback using OpenMentor in three different UK universities', 10th International Conference on Computer Based Learning in Science (CBLIS 2012), Barcelona, Spain, 26-29 June 2012. Whitelock, D., Gilbert, L. & Gale, V. (2011) 'Technology-Enhanced Assessment and Feedback: How is evidence-based literature informing practice?', International Computer Assisted Assessment Conference, De Vere Grand Harbour Hotel, Southampton, 5-6 July 2011. http://caaconference.co.uk/wp-content/uploads/WhitelockB-CAA2011.pdf Whitelock, D. (2010) 'Activating Assessment for Learning: are we on the way with Web 2.0?', in M.J.W. Lee & C. McLoughlin (eds) Web 2.0-Based E-Learning: Applying Social Informatics for Tertiary Teaching, IGI Global, pp. 319-342.
    • 74. References (2): Whitelock, D. & Watt, S. (2008) 'Putting pedagogy in the driving seat with Open Comment: an open source formative assessment feedback and guidance tool for History students', CAA Conference 2008, Loughborough University, 8-9 July 2008, ed. Farzana Khandia, pp. 347-356, ISBN 0-9539572-7-6. http://kn.open.ac.uk/public/document.cfm?docid=11638 Whitelock, D. & Watt, S. (2007) 'e-Assessment: How can we support tutors with their marking of electronically submitted assignments?', Ad-Lib Journal for Continuing Liberal Adult Education, Issue 32, March 2007, pp. 7-9, ISSN 1361-6323.
    • 75. References (3): Whitelock, D. (2006) 'Electronic Assessment: Marking, Monitoring and Mediating Learning', in McAndrew, P. & Jones, A. (eds) Interactions, Objects and Outcomes in Learning, special issue of International Journal of Learning Technology, Vol. 2, Nos 2/3, pp. 264-276. Whitelock, D. & Watt, S. (2006) 'OpenMentor: opening tutors' eyes to the written support given to students in their assignments', JISC Conference 2006, Information & Communication Technology in Education and Research, International Conference Centre, Birmingham, 14 March 2006.
