Learning and Educational Analytics
 

Learning analytics and Moodle: So much we could measure, but what do we want to measure? A presentation to the USQ Math and Sciences Community of Practice May 2013


Usage Rights

CC Attribution-NonCommercial-ShareAlike License

  • Source: McKinsey Report: Big Data: The Next Frontier for Innovation, Competition, and Productivity
  • Suthers, D. D., & Rosen, D. (2011). A unified framework for multi-level analysis of distributed learning. Proceedings of the First International Conference on Learning Analytics & Knowledge, Banff, Alberta, February 27–March 1, 2011.

Learning and Educational Analytics: Presentation Transcript

  • Learning analytics and Moodle: So much we could measure, but what do we want to measure? Associate Professor Michael Sankey, EdD, Director, Learning Environments and Media
  • SAF Embarking on a project that involves: Establishing common codebase across all our 3Moodle environments fully aligned with Mahara Extending the functionality of eAssessment withinMoodle (replacing EASE, CMA, EMS) Establishing a suite repositories in Equella Create new digital rights management workflow Enhance discoverability Establish learning analytics across L&T systems Align help resources to new regime and Provide PD
  • Learning analytics for our systems:
    - Which systems? What tools?
    - How big do we want the data? Is it just our USQ systems?
    - What do we want to know? How do we want to use this data?
    - Who gets involved? Who makes the decisions?
    - I'll come back to these questions at the end, but first some background…
  • Siemens, G. 2013. Structure and logic of analytics. Available from http://www.learninganalytics.net/
  • Three levels within the institution: learning analytics, educational data mining, and educational analytics. All focus on the learner to some degree, either as an individual or in context to the institution.
  • Academic and Learning Analytics: http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education
  • Siemens, G. 2013. Structure and logic of analytics. Available from http://www.learninganalytics.net/
  • Developing an analytics framework (adapted from Terenzini, 2013; Padró & Frederiks, 2013):
    - Identify Tier 1 data: data that exists (technical/analytical)
    - Form Tier 2 data: issues intelligence
    - Create Tier 3 data: context intelligence
    - SBMI and PeopleSoft: AUSSE/UES; grades; graduation rate; persistence; retention; student demographics; student satisfaction data; transfer rates
    - L&T systems & RightNow: co-curricular student engagement activities data; course interactions; systems data; Learning Centre data; other student learning support activities data
    - Institutional emphasis for data collection & analysis: customer service (transactional) or student development (national policy preference)
  • Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.
  • http://www.solaresearch.org/
  • Where did it originate? SoLAR exists to ensure that there is an expansive, transformative vision for what analytics might mean for the future of learning, and to promote a very critical discourse that is non-partisan and grounded as far as possible in practice-based research. SoLAR is a non-profit organization; incorporation is currently underway.
  • Scope At what level do we pitch? LMS data analytics Easier to implement Limited data so not the whole picture Logs in Moodle are good, but not comprehensive The Learning ecosystem analytics Complex – needs an open standards model andpotentially access to external repositories Much more holistic picture In our case, Mahara, Equella, BBCollaborate, EASE, Library, lecture capture, etc
  • The LMS is one of the primary providers of the data, since it preserves digital footprints of student interactions which can be mined for patterns of learning behaviour and teaching practice; this allows for benchmarking and the monitoring of institutional quality initiatives.
  • Predictive analysis indicates that some students are higher risk than others; for those who are first in family or from a low socio-economic background, the risk of failure increases. There is a question as to what constitutes quality learning. Analytics is a key player in this field, given that it provides a vast amount of data and techniques for its analysis. Learners and their context are vitally important in this discussion.
  • How big is the data? Typically in our Moodle we generate between 50-100 million log records per year. What is the aim? Finding out we have a problem before it fully manifests. If we accept this, it gives us a framework to consider our options.
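At that log volume, the first practical step is usually aggregation rather than record-by-record inspection. A minimal sketch of rolling raw events up to per-student weekly activity counts (the log tuples here are invented and simplified; Moodle's real log tables have a richer schema):

```python
from collections import defaultdict
from datetime import date

# Hypothetical simplified log records: (student_id, action, ISO date).
logs = [
    ("s1", "view", "2013-03-04"),
    ("s1", "post", "2013-03-05"),
    ("s2", "view", "2013-03-04"),
    ("s1", "view", "2013-03-12"),
]

def weekly_activity(records):
    """Count events per student per ISO week."""
    counts = defaultdict(int)
    for student, _action, day in records:
        iso_year, iso_week, _ = date.fromisoformat(day).isocalendar()
        counts[(student, iso_year, iso_week)] += 1
    return dict(counts)

print(weekly_activity(logs))
```

A weekly grain like this is small enough to chart per course, which is what most of the dashboard tools discussed below effectively do.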
  • A perspective on big data: OUA
    - As with other fields, the key questions to ask are: what do you want to know; why do you want to know it; and what are you going to do next?
    - Analytics makes sound educational and financial sense; it increases retention and encourages students to enrol again.
    - The biggest factor in student retention is intent of purpose: why are they doing what they want to do? There are things that can be done to aid them to achieve their intent of purpose.
    - Previous education is the single biggest predictor of success, so the question becomes: what supports can we put in place for those without it? For example, invigilated exams in a student's first unit decrease their chance of success, which raises learning design issues for our introductory units.
    - Other data shows that older students and female students are more likely to succeed in their first course.
    - Coaching and contact are also predictors of retention and success, along with preparatory units.
  • QUT study – Wendy Harper. Overall, the conclusions she drew were:
    - The key predictor of success in a unit is GPA.
    - The number of hits and days visiting a QUT Blackboard unit site also predicts unit success.
    - Students who are likely to fail a unit often do not engage early enough with their online environment.
    - Students who fail a unit often have alternating high peaks of engagement and total disengagement.
    - Factors such as gender, international or domestic enrolment, and age make very little difference to student behaviour in online units.
    - 'Narrowly failing' students often perform a much greater amount of online activities in the unit they are struggling in.
    - 'Narrowly failing' students often show high engagement around early assessment pieces, but this drops off as the semester progresses.
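The "early engagement" finding above translates naturally into a simple flagging rule. A hypothetical sketch (the threshold, data, and names are illustrative, not drawn from the QUT study):

```python
# Hypothetical per-student site hits over the first four weeks of semester.
early_hits = {"alice": 42, "bob": 3, "carol": 17}

def flag_at_risk(hits_by_student, threshold=10):
    """Flag students whose early site activity falls below a threshold."""
    return sorted(s for s, hits in hits_by_student.items() if hits < threshold)

print(flag_at_risk(early_hits))  # → ['bob']
```

A real rule would need tuning per unit, since baseline activity varies with how much of the unit is delivered online.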
  • "Learning and knowledge creation is often distributed across multiple media and sites in networked environments. Traces of such activity may be fragmented across multiple logs and may not match analytic needs. As a result, the coherence of distributed interaction and emergent phenomena are analytically cloaked" (Suthers & Rosen, 2011).
  • Athabasca's approach (Siemens, G. 2013. Structure and logic of analytics. Available from http://www.learninganalytics.net/):
    - Strategy: data inventory; role of data (problem or opportunity); stakeholders (IR, academic, admin); access; governance; compliance
    - Planning & resources allocation: data/analytics team; data sources; budget; priorities; stages of deployment; policy development
    - Metrics & tools: analytics goals & target areas; educator-controlled tools; enterprise tools; iterative development of algorithms; visualization
    - Capacity development: faculty/staff PD; student access; learning design; process mapping and evaluation; intelligent curriculum
    - Systemic change: course models?; self-directed learning; automated discovery; student models
  • Ethics The ethical professional Respecting the rights of students Stepping in to provide pastoral support advise about risks of failure advise about increasing chances of success Research ethics Risk minimisation Needed for publication Issues with: accessing „databanks‟ anonymity
  • Privacy issues: the greater good vs Big Brother
    - Teacher: "It's unethical not to tell a student they are at risk of failing."
    - Student: "I don't want you to be looking over my shoulder. I can make my own choices about my study."
    - Reports to staff vs dashboards for students
  • Usefulness of analytics: what is the question to which analytics is the answer?
    - Don't just buy a product.
    - Learning analytics are just indicators of behaviour; they don't explain behaviour.
    - A single source of analytic data is probably insufficient; combine data into a data warehouse.
    - Time to look at some different options.
  • The Engagement Analytics block (http://docs.moodle.org/22/en/report/analytics/index)
    - It provides information about student progress against a range of indicators, giving feedback on the level of "engagement" of a student. "Engagement" refers to activities which have been identified by current research to have an impact on student success in an online course.
    - The plugin was developed as part of a NetSpot Innovation Fund project by Monash University (Dr Phillip Dawson), with code by NetSpot developers (Ashley Holman & Adam Olley).
    - It is a block that teachers can add to their Moodle course that will provide them with a quick graphical snapshot of which students are at risk.
    - It is important to note that the purpose of the plugin is to provide teachers with information only; it does not automatically take any action based on the indicators, e.g. no email or notification is sent to students automatically.
    - If desired, the teacher would follow up on the information themselves, based on what they know about the student and their other communications.
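As a rough illustration of how an indicator-based block like this combines signals into a single risk snapshot, consider a weighted-score sketch (the indicator names, weights, and data here are invented for illustration; they are not the plugin's actual configuration):

```python
# Hypothetical per-student indicator scores, each normalised to 0..1,
# where 1 means maximum risk on that indicator.
indicators = {
    "alice": {"login": 0.1, "forum": 0.2, "assessment": 0.0},
    "bob":   {"login": 0.9, "forum": 0.8, "assessment": 0.7},
}

# Illustrative weights summing to 1; a real block makes weighting configurable.
weights = {"login": 0.4, "forum": 0.3, "assessment": 0.3}

def risk_score(student_indicators, weights):
    """Weighted sum of risk indicators, in the range 0..1."""
    return sum(weights[k] * v for k, v in student_indicators.items())

scores = {s: round(risk_score(ind, weights), 2) for s, ind in indicators.items()}
print(scores)  # → {'alice': 0.1, 'bob': 0.81}
```

The important design point from the slide survives the simplification: the output is information for the teacher, not an automated intervention.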
  • GISMO It is a visualization tool for Moodle that obtains trackingdata, transforms the data into a form convenient forprocessing, and generates graphical representations thatcan be explored and manipulated by course instructors toexamine social, cognitive, and behavioral aspects ofdistance students. It can be included in any Moodle course as side block.Since it is aimed to help instructors, this block will bevisible only to users who have the instructor role(students dont see it). Each time the Moodle cron jobs runs, GISMO fetchesstudents data from Moodle logs, and performs somestatistical calculations. The lifetime of GISMO datacorresponds to the length of time of your Moodle logs.
  • It has Accesses overview A graph reporting the students accesses to the course. Accesses to the course A graph reporting accesses for each student in a timeline. Accesses overview on resources A graph reporting the number of accesses made by the students to the resourcesof the course Assignments overview A graph reporting the submission of assignments. Color is mapped to the gradeassigned by the teacher. Quizzes overview A graph reporting the submission of quizzes. Color is mapped to the grade. Resources accesses overview A graph reporting an overview of the number of accesses to resources of thecourse. Resources accessed by a particular student A graph reporting an overview of the students accesses to resources on a timeline. Students accesses to resources A graph reporting, for each student, the number of accesses to resources of thecourse.
  • Macquarie, UNSW and NetSpot are working on a new student-centric analytics tool for Moodle.
  • SNAPP: http://www.snappvis.org/
  • SNAPP The Social Networks Adapting PedagogicalPractice (SNAPP) tool performs real-time socialnetwork analysis and visualization ofdiscussion forum activity within popularcommercial and open source LearningManagement Systems (LMS). It essentially serves as a diagnosticinstrument, allowing teaching staff to evaluatestudent behavioral patterns against learningactivity design objectives and intervene asrequired a timely manner.Dawson, S., Macfadyen, L., Lockyer, L., & Mazzochi-Jones, D. (2011).Using Social Network Metrics to Assess the Effectiveness of Broad-Based Admission Practices.Australasian Journal of Educational Technology, 27(1), 16-27.Also available from: http://www.snappvis.org/?page_id=4
  • Info from Shane Dawson (UniSA)
    - Social interaction is one of the most important student behaviours and predictors of success. Student networks are the "single most potent source of influence."
    - The tool provides a visualisation of social networking. Different patterns are available to the individual, along with mechanics which allow the data to be manipulated for different purposes.
    - It demonstrates that with students, like responds to like; they form self-regulating structures.
    - It may be possible to manipulate group structures so that high-performing students can assist low-performing ones. It may also be possible to direct teachers' time to areas of need.
    (Dawson, S., Macfadyen, L., Lockyer, L., & Mazzochi-Jones, D. (2011). Using Social Network Metrics to Assess the Effectiveness of Broad-Based Admission Practices. Australasian Journal of Educational Technology, 27(1), 16-27.)
  • [Slide: social network analysis visualisation, "Learning Analytics for Understanding": the lowest 10% of students located in the network, alongside students with a grade >75% and <90%.]
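The kind of forum network SNAPP visualises can be sketched in a few lines. A toy example with invented reply data (real SNAPP extracts these pairs from LMS forum threads), computing each participant's number of distinct interaction partners, the simplest centrality measure:

```python
from collections import defaultdict

# Invented forum reply pairs: (author_of_reply, author_replied_to).
replies = [
    ("bob", "alice"), ("carol", "alice"),
    ("alice", "bob"), ("dave", "carol"),
]

def degree_centrality(edges):
    """Number of distinct interaction partners per participant
    (undirected: a reply connects both people)."""
    partners = defaultdict(set)
    for a, b in edges:
        partners[a].add(b)
        partners[b].add(a)
    return {person: len(p) for person, p in partners.items()}

print(degree_centrality(replies))
```

Isolated or peripheral students show up immediately as low-degree nodes, which is exactly the pattern the slide's visualisation highlights for the lowest-performing group.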
  • Available from http://grsshopper.downes.ca/about.htm
  • Refining the signals from the Twitter feed: http://mashe.hawksey.info/2012/11/cfhe12-analysis-summary-of-twitter-activity/
  • BIM David Jones FoE BIM (BAM into Moodle). BAM = Blog Aggregation Management. BIM is a Moodle module that supports an activity where: Each student registers an individual external web feed. The feed mightbe generated by a blog, twitter or any other tool that produces a webfeed. Its the students choice what they use. Each student uses that external feed to respond to a set ofquestions. Currently, those questions usually encourage thestudent in reflecting on their learning, often in the form of areflective journal. There is no need to have a set of questions. it maintains a copy of each students web feed, and attempts toallocate student posts to the questions. it allows different teachers to track, manage and mark posts fordifferent groups of students. Allows a coordinating teacher to allocate teaching staff todifferent groups, track their marking progress and all studentactivity. Student results can be sent to the Moodle gradebook.Jones, D. 2013. BIM – Feed Aggregation. Available from http://davidtjones.wordpress.com/research/bam-blog-aggregation-management/
  • ACODE prepared a literature review containing 165 categorised references in an EndNote library.
  • Learning analytics for our systems: the big Q, and your big answer
    - Which systems?
    - What tools?
    - How big do we want the data?
    - Is it just our USQ systems?
    - What do we want to know?
    - How do we want to use this data?
    - Who gets involved?
    - Who makes the decisions?