Learning Analytics: What it is, where we are, and where we could go

Talk given at the Computers and Learning Research Group (CALRG) annual conference, 12 June 2013, at The Open University, UK.
This presentation briefly reviews learning analytics, using some key examples. It then assesses what the OU is doing, and then sets out some ideas for what the OU could do in future to harness the potential of data about our learners to improve their learning.

  • Data mining, academic analytics, learner analytics – the focus here is on the learning itself, not on the management and administration of learning. International profile.
  • We’ve always done this! Just more of it, and faster – Marx, quoting Hegel: a sufficiently large quantitative change is a qualitative one.
  • Builds on Neil Mercer’s work on content analysis.
  • Speed and length of cycles: from instant feedback as you learn, all the way through to government policy.
  • Surveillance is a concern. Openness and transparency are key.
  • Raiders of the Lost Ark – the ark is “someplace safe, being studied by top people” – in a TOP SECRET crate just like millions of others in a warehouse.
  • Predict students at risk of non-completion, using OU data, mainly from the VLE. Clickstream data is predictive – a drop in your usage matters (the absolute level is a weaker signal). Not submitting a TMA is predictive; a low mark is too. It is easier to predict non-completion earlier than later (not submitting the first TMA is a worse sign than not submitting the last).
  • We want faster cycles, at a larger scale.
  • Speed, scale, and quality of response. Get it to the learners and teachers.
  • Coming soon!
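The note above on predicting non-completion risk can be sketched as a toy traffic-light scorer, using the signals the notes describe (a drop in VLE clicks, TMA non-submission, a low mark). The function name, thresholds, and weights below are invented for illustration and are not the OU's actual model:

```python
# Illustrative traffic-light risk scoring. All thresholds, weights, and
# field names are hypothetical; the real predictive model is not shown here.

def risk_band(weekly_clicks, tma_submitted, tma_mark=None):
    """Classify a student as 'green', 'amber' or 'red'.

    weekly_clicks: recent VLE click counts, oldest first.
    tma_submitted: whether the most recent TMA was handed in.
    tma_mark: mark for that TMA (0-100), if submitted.
    """
    score = 0
    # Not submitting a TMA is the strongest warning sign in the notes.
    if not tma_submitted:
        score += 2
    elif tma_mark is not None and tma_mark < 40:
        score += 1  # a low mark is also predictive
    # A drop in usage matters more than the absolute level.
    if len(weekly_clicks) >= 2 and weekly_clicks[-1] < 0.5 * weekly_clicks[0]:
        score += 1
    return ["green", "amber", "red"][min(score, 2)]

print(risk_band([40, 35, 12], tma_submitted=False))               # red
print(risk_band([30, 28, 25], tma_submitted=True, tma_mark=72))   # green
```

A real system would learn the weights from historical completion data rather than hard-coding them, but the traffic-light output matches the intervention trigger described on slide 8.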
  • Transcript

    • 1. Learning Analytics: What it is, where we are … and where we could go. Doug Clow, CALRG Conference, 12 June 2013. This work by Doug Clow is licensed under a Creative Commons Attribution 3.0 Unported License.
    • 2. What it is / Where we are / Where we could go
    • 3. Quick Quiz: Learning Analytics. a) This is the first time I’ve heard of it; b) I know it’s one of the latest buzzwords; c) I’ve heard of some projects; d) I’m doing some projects; e) I have to keep turning down keynote invitations on this subject. cc licensed (BY) flickr photo by Swaminathan: http://flickr.com/photos/araswami/2168316216/
    • 4. What it is
    • 5. What is learning analytics? “The measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” – First International Conference on Learning Analytics And Knowledge (LAK11), Banff, Alberta, Feb 27 – Mar 1, 2011. cc licensed (BY) flickr photo by Cris: http://flickr.com/photos/chrismatos/6917786197/
    • 6. An abundance of data • Qualitative and quantitative • To improve learning
    • 7. Learning analytics • Learner analytics • Academic analytics • Business Intelligence • Educational Data Mining • Web analytics • Social network analysis • Predictive modeling • Latent semantic analysis • … and more! (cc) gareth1953 http://www.flickr.com/photos/gareth1953/5477477947/
    • 8. Predictive modeling, data mining (Blackboard) • Fit students into one of three risk groups => traffic light • Trigger for intervention emails • Consistent grade performance improvement • Dramatic retention improvements
    • 9. Social Network Analysis • Social Networks Adapting Pedagogic Practice • Network visualisations of forum activity data from the VLE • See patterns • Spot central and disconnected students • Identify at-risk students • Improve teaching
    • 10. Content/semantic analysis • Lárusson and White, 2012
    • 11. Usage tracking • Santos, Govaerts, Verbert and Duval, 2012
    • 12. Social learning analytics • Ferguson and Buckingham Shum, 2012
    • 13. Clow, LAK12, 2012
    • 14. (Copyright status hard to verify)
    • 15. Where we are
    • 16. ‘Traditional’ data sources • IET Student Statistics and Surveys Team – student data: demographics, qualification aim, modules taken, results, etc.; student feedback data: End of Module Survey and others (all change) • Information Office (Planning, Forecasts) • Marketing • VOICE • CIRCE, PLANET • CAMEL, FRODO, ISDES, PIMS, etc. … cc licensed (BY) flickr photo by Walmart: http://flickr.com/photos/walmartcorporate/5391507982/
    • 17. New-ish sources • Learning delivery reports • Marketing analytics • Open media analytics • Web analytics (DAX) …
    • 18. VLE data
    • 19. VLE data
    • 20. IET Data Wranglers • Student feedback • VLE usage • Learning delivery reports
    • 21. Where we could go
    • 22. Clow, LAK12, 2012
    • 23. Improving feedback. (cc) Doug Clow http://dougclow.com
    • 24. Student Support Tool • Team specify the most important activities in a module – what activities are important, and when • Tool shows – which students have done the important activities – general usage indicators, with comparison. cc licensed (BY) flickr photo by Mike Baird: http://flickr.com/photos/mikebaird/398077070/
    • 25. Course Signals for the OU. cc licensed (BY SA) flickr photo by Fin Fahey: http://flickr.com/photos/albedo/97586501/
    • 26. A/B Testing • Find out what works – at a smaller, more fine-grained level – in a matter of weeks, not years • Settle arguments about how to teach – with data from actual students! cc licensed (BY) flickr photo by LASZLO ILYES: http://flickr.com/photos/laszlo-photo/4093575863/
    • 27. Bad reasons for not testing, and why we should test:
      – “We’ve never done it before.” → Others are doing it. And we have done it, just small-scale and unsystematically.
      – “We know what’s right already.” → No. We think we know what’s right.
      – “We might find out we were wrong.” → Better to find out sooner and improve faster.
      – “It’ll deskill academics.” → It’ll help academics do even better.
      – “It’s unethical.” → It’s unethical not to. Ask doctors.
      – “It’ll cost too much.” → It’s costing us too much already.
      – “We don’t know how.” → There is a way to change our capacity to act: learning.
    • 28. Good reasons for not testing, and what we should do about them:
      – Students might not want to be experimented on. → Explain the benefits. Only test on people giving informed consent.
      – It’ll make things more complicated. → Design tests carefully to minimise disruption.
      – We don’t agree about what’s a good outcome: retention, pass rate, progression, attainment, improvement as a human being, … → Track multiple outcomes. Work towards consensus – if we can’t agree what counts as good, we’re in trouble.
      – You can ‘game’ outcomes – e.g. a too-easy exam gets ‘good’ results. → Don’t do that. Maintain quality procedures.
      – What if we start testing and one version is much, much better? → If the evidence is strong enough, stop the trial early. (There are sums for this.)
      – We don’t have the systems, or the culture, to do systematic testing. → We should build them.
    • 29. Us / Them
    • 30. Learning analytics at the OU: • We’re doing some interesting things • We’re going to do more • We could and should do a lot, lot more. Systematic innovation & improvement … through evidence-based practice. Doug Clow, @dougclow, dougclow.org, doug.clow@open.ac.uk. cc licensed (BY) flickr photo by Vince Alongi: http://flickr.com/photos/vincealongi/2537227873/
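Slide 28's aside that "there are sums for this" can be illustrated with a standard two-proportion z-test, the usual starting point for comparing a retention or pass rate between two versions of a teaching activity. The function name and the student counts below are invented for illustration:

```python
# A minimal sketch of the A/B comparison the slides propose: is the
# retention rate under version B genuinely different from version A?
# The numbers are made up; a real trial would also plan its stopping rule.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for H0: the two underlying rates are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Version A: 120 of 200 students retained; version B: 150 of 200.
z = two_proportion_z(120, 200, 150, 200)
print(round(z, 2))  # |z| > 1.96 => significant at the 5% level
```

Stopping a trial early when one version is "much, much better", as slide 28 suggests, needs a sequential design (e.g. group-sequential boundaries) rather than this single fixed-sample test, since repeatedly peeking at z inflates the false-positive rate.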
