1 introduction



An introduction to the workshop, followed by discussion of Context and Task.

Part of the slides for a workshop titled "Four questions for understanding Learning Analytics" by @beerc and @djplaner

Published in: Education, Technology
  • Make the point about how the session will be run: initially mostly theoretical, with show and tell and discussion, and an opportunity for some hands-on work at the end. The rationale for this: it is difficult to know the constraints on practical activities given the short time and unfamiliar location; it gives flexibility to attendees who want an early night; and it leaves the possibility of more discussion and activities tomorrow.
  • First, a little about who we are and where we have come from. I’m interested in analytics because I think it can help with e-learning in many different ways. In particular, and in the short term, I’d like to see analytics used to directly help teaching academics, as I believe they are the key interface between the university and students. I’m also amazed at how little of the mountains of data that universities collect is actually used in meaningful ways. David and I started the Indicators project at CQUniversity back in 2008, I think, and we’ve been tinkering with learning analytics ever since. I’ll show you some of the embarrassing early work in a minute. CQUniversity is what I’d describe as a typical university in terms of its systems and its approach to them. We use PeopleSoft as our student information system and Moodle as our LMS, and we have a fairly mature business intelligence area that maintains the data warehouse. We have a centralised organisational structure: a central IT department, a central L&T department, and so on. What would I like from this session? I’d like to hear from you. I’d love to hear how you and/or your institution is approaching learning analytics.
  • The story of the Indicators project, along with the next few examples.
  • X axis: grades. Y axis: student clicks, in bands: over 40, 30–40, 20–30, under 20.
  • These sorts of interesting correlations are really what piqued our interest in learning analytics
  • Go around the room
  • Ask who likes or dislikes these definitions
  • At the school level indicated in George’s model, the context of each university will vary. E.g. CQUniversity is almost 70% distance students now, which isn’t comparable with the likes of UQ, which is predominantly face-to-face. The context of each program will vary, e.g. engineering versus creative and performing arts programs. The context of each course will vary in terms of what it will require from learning analytics: different pedagogical intents, different disciplines, different assessment strategies, etc.
  • This model of analytics comes from George Siemens’ latest journal article giving an overview of the field of learning analytics. I recommend the article, and I like this model in that it captures what passes for current understandings of learning analytics. But I also have an issue with it: to me it smacks too much of “do it to and for” and not enough of “do it with”. It also, for me, illustrates a criticism that Col, our colleague Damien Clark and I have of current approaches to learning analytics, which we try to capture in our own framework, called the IRAC framework. Come along to our workshop this afternoon to learn more.
  • Let’s look at the average number of forum posts and replies for over 39,000 distance student–course combinations. You can see the correlation between the students’ level of forum activity and their grades, and this, for me, fits nicely with my confirmation bias that suggests this should be so.
  • But if we look at the average number of forum contributions per student across all courses in a single year, you will note how significant the variation really is. Most courses are taught by different teachers with different design philosophies, and consequently they all tend to utilise the discussion forums in different ways.
  • And if we drill down even further to a single student who has received an HD in all of the courses they have completed to date, you can see the diversity in the number of forum contributions they make across their 18 courses.
  • There is a whole range of different tasks that might be supported by learning analytics. Let’s have a look at some. The first and most popular is risk analytics: the application of learning analytics to identify students who may be struggling and at risk of failing or dropping out. We are no different, in that we’re looking to do something similar.
  • Everyone is considering learning analytics. CQUniversity’s attrition problem.
  • Desire2Learn’s Student Success System.
  • Social network analytics. We’ve all probably heard about SNAPP from our good friend Shane Dawson from Uni SA. The social network diagrams can be used to identify: isolated students; facilitator-centric network patterns, where a tutor or academic is central to the network with little interaction occurring between student participants; group malfunction; and users that bridge smaller clustered networks and serve as information brokers. The really clever thing about SNAPP is that it can be run on discussion forums without the need for any changes to your backend systems. It works within the web browser on your local computer, which means you don’t have to battle the IT department to have it installed.
  • Notice that GA3 (teamwork) is the least represented of all the graduate attributes on average.
  • Adaptive learning
  • So, getting back to task, you can see there are many, many tasks that learning analytics might be able to help with. At the individual level the tasks are quite specific and are often based on a very precise context: staff member, student, student group, etc. Yet learning analytics implementations appear to be dominated by large-scale institutional projects that simply can’t cater to the diversity of tasks that may benefit from learning analytics.
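The simple pattern on the forum posts/grades slide — averaging forum activity within each grade band — can be sketched in a few lines. This is a hypothetical illustration only: the function name and the sample records are invented, not the Indicators project dataset or code.

```python
# Hypothetical sketch of the "simple patterns" analysis: average forum
# activity per grade band across student-course combinations.
from collections import defaultdict

GRADE_ORDER = ["F", "P", "C", "D", "HD"]  # fail through to high distinction

def average_posts_by_grade(records):
    """records: iterable of (grade, post_count) per student-course combination."""
    totals = defaultdict(lambda: [0, 0])  # grade -> [sum of posts, count]
    for grade, posts in records:
        totals[grade][0] += posts
        totals[grade][1] += 1
    # Report averages in grade order, skipping grades with no data.
    return {g: totals[g][0] / totals[g][1] for g in GRADE_ORDER if totals[g][1]}

# Invented sample data, purely for illustration.
sample = [("F", 1), ("F", 0), ("P", 2), ("C", 3), ("D", 3), ("D", 4), ("HD", 5)]
print(average_posts_by_grade(sample))
# -> {'F': 0.5, 'P': 2.0, 'C': 3.0, 'D': 3.5, 'HD': 5.0}
```

As the "mythical mean" slide warns, such averages hide enormous per-course and per-teacher variation, so the monotonic pattern above should not be read as a rule for any individual course.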

    1. Four questions for understanding Learning Analytics. Dr David Jones, University of Southern Queensland, Toowoomba QLD; Colin Beer, CQUniversity, Rockhampton QLD
    2. Facilitators • Dr David Jones – Faculty of Education @ University of Southern Queensland – Foundation member of the Indicators project. • Colin Beer – Learning and Teaching Services @ CQUniversity – Foundation member of the Indicators project. http://indicatorsproject.wordpress.com
    3. This workshop will use a research-based framework of four questions to help you: • Increase your awareness of what learning analytics is and what is currently being done (and not done) with it • Understand how insights from a range of knowledge bases can better inform learning analytics projects • Develop insights into how you can use learning analytics to complete your own task
    4. How it will work • Context – Yours, ours and what we’d like to get out of this • Task – Some examples of Learning Analytics • Information? • Representation? • Affordances? • Change?
    5. How it will work • Context – Yours, ours and what we’d like to get out of this • Task – Some examples of Learning Analytics • Information? • Representation? • Affordances? • Change? Theory and discussion Some hands on
    6. Assumptions • Value of learning analytics is when integrated into “tools & processes of teaching & learning” (Elias, 2011, p. 5) and when it can “provide workers with the help they need to perform certain job tasks, at the time they need that help, and in a form that will be most helpful” (Reiser, 2001, p. 63)
    7. (Image: http://farm3.staticflickr.com/2734/4152919570_3acdefc13e_z.jpg)
    8. Who we are • Colin Beer from CQUniversity (Rockhampton) • Lecturer (Educational Technology) within the Learning and Teaching Services Area • Why am I interested in Learning Analytics? • Learning Analytics activities at CQUniversity • Systems and technologies? • What I would like from this session?
    9. Who we are • David Jones from USQ (Toowoomba) • Senior Lecturer within the Faculty of Education • Why am I interested in Learning Analytics? • Learning Analytics activities at CQUniversity • Systems and technologies? • What I would like from this session?
    10. Where it all started • The Indicators project • LMS support • Curiosity about LMS behaviour and student results • Interesting correlations
    11. Blackboard, term 1, 2006. Chart: Learner-Content 69%, Learner-Learner 20%, Learner-Teacher 11%, Teacher-Teacher 0%
    12. Moodle, term 1, 2011. Chart: Learner-Content 78%, Learner-Learner 11%, Learner-Teacher 10%, Teacher-Teacher 1%
    13. Chart: number of question marks by grade (F, P, C, D, HD), with linear trend (n=273814)
    14. Your Context. Please tell us: • Who you are and where you are from • What is your interest in Learning Analytics? • What Learning Analytics activities are planned or underway at your institution? • What systems/technologies are potential sources of information in your institution? • What do you want from this session?
    15. Learning analytics definitions: A key concern in learning analytics is the need to use the insights gathered from the data to make interventions, to improve learning and to generate ‘actionable intelligence’ which informs appropriate interventions (Campbell, DeBlois & Oblinger 2007)
    16. Learning analytics definitions: “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Long and Siemens 2012)
    17. Learning analytics definitions: “Learning analytics is the application of … Big Data techniques to improve learning” (Clow, 2013)
    18. Learning analytics definitions: “Learning analytics is the application of … Big Data techniques to improve learning” (Clow, 2013)
    19. Some simple patterns. Chart: average number of forum posts and replies by student grade (F, P, C, D, HD)
    20. The mythical mean. Chart: average number of contributions per student across Moodle courses in a single year
    21. Single HD student. Chart: number of forum contributions across 18 individual courses
    22. Task
    23. First day of access. Chart: first day of access by student grade (F, P, C, D, HD), distance students (n=35623)
    24. SNAPP
    25. Gephi
    26. BIM
    27. Chart: average graduate attribute levels (ga1–ga8) by assessment and learning outcomes, CQUni (2011)
    28. www.knewton.com
    29. Individual/specific – Institutional/vague (Image: http://farm6.staticflickr.com/5002/5226383821_378b5a136e_z.jpg)
    30. Layers of Learning Analytics: Micro (process-level; learner and teacher; social network analysis, NLP, assessing engagement), Meso (institutional; department, university; risk-detection, intervention and support services), Macro (cross-institutional; region, state, international; optimisation, external comparison, regulatory reporting)
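The SNAPP slide and its notes describe building a who-replies-to-whom network from forum posts, then looking for isolated students and facilitator-centric patterns. A minimal sketch of that kind of analysis follows; it is not SNAPP's actual code, and the function names, sample replies, and threshold are invented for illustration.

```python
# Hypothetical sketch of SNAPP-style forum analysis: build an undirected
# reply network, then flag isolated participants and facilitator-centric
# patterns (most ties involve one hub, e.g. the tutor).
from collections import defaultdict

def build_reply_network(replies):
    """replies: iterable of (author, replied_to) pairs from a forum."""
    neighbours = defaultdict(set)
    for author, target in replies:
        neighbours[author].add(target)
        neighbours[target].add(author)
    return neighbours

def isolated(participants, network):
    """Participants with no forum connections at all."""
    return [p for p in participants if not network.get(p)]

def facilitator_centric(network, hub, threshold=0.8):
    """True if at least `threshold` of all ties involve the hub,
    suggesting little student-to-student interaction."""
    edges = {frozenset((a, b)) for a, ns in network.items() for b in ns}
    hub_edges = [e for e in edges if hub in e]
    return len(hub_edges) / len(edges) >= threshold

# Invented sample data: three of four ties involve the tutor.
replies = [("tutor", "ann"), ("bob", "tutor"), ("tutor", "cam"), ("ann", "bob")]
net = build_reply_network(replies)
print(isolated(["ann", "bob", "cam", "dee"], net))        # -> ['dee']
print(facilitator_centric(net, "tutor", threshold=0.7))   # -> True
```

The appeal of SNAPP noted in the slides — that it runs in the browser over an existing forum page — means none of this backend plumbing is required in practice; the sketch only shows the shape of the computation.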