Ascilite 2012


  • My name is….. Analytics is getting a great deal of press lately, with many articles talking about the amazing potential of academic analytics and learning analytics. We are taking a slightly different approach today by looking at some of the problems that we believe might be associated with analytics in higher education, based on our experience with an analytics project over the last four years.
  • But first some definitions for some of the terms I’ll be using this afternoon.
  • First of all, managerialism. Universities are increasingly managed as if they were businesses in a competitive marketplace. Accountability for public funding requires the rational allocation of resources and the intentional management of change. The ubiquitous adoption of LMSs has been linked with managerialism due to their ability to provide orderly mechanisms for control over online learning and teaching. A key problem, from our perspective, with the teleological or top-down management of learning environments is that it assumes the systems and their components are stable and predictable.
  • Educational data mining, in our context, is the gathering of data from the various information systems used by university staff and students, and exploring that data for items of significance. Educational data mining includes sources such as computer logs, student performance, demographics and so on.
  • Academic analytics is the use, by universities, of data collected through educational data mining. It is often used to extract information about attrition, retention, pass rates et cetera. Most universities have business intelligence areas that are now delving into academic analytics. Corporate data dashboards are one familiar example of academic analytics.
  • Learning analytics is again the use of data developed through educational data mining, but it is more focused on better understanding and optimizing learning and the learning environment. Social network analysis, personalization and adaptation are some of the methods used with learning analytics.
  • Adapted from George Siemens' blog post last year. It's a very rough guide only, but it helps define the distinction between academic analytics and learning analytics. Note that this is based on CQUniversity structures, where degree programs are composed of individual courses.
  • The Indicators project has been running since 2008. It started when the foundation members were responsible for supporting academics in the use of the then Blackboard learning management system, and we had access to the Blackboard backend database. The activity log had never been cleared, which meant we had every student click within the LMS since it was installed four years previously.
  • We then managed to get access to the student administration system, which had grade and demographic information on each and every student. So we started looking at correlations between student activity on the LMS and their resulting grades, amongst other things. We are now at a point where we have established our own servers and databases that draw data from a range of university systems, and this allows us to look at a whole range of interesting things around online learning.
  • Basically, all we are doing here is combining the data from a number of different university systems that typically don't talk to one another into a single data source. This enables us to produce some interesting correlations between student grades and their online behaviours.
  • This chart simply correlates the number of student clicks on our Moodle LMS with their resulting grades. The horizontal axis is student grades according to CQUniversity's grade definitions; the vertical axis is the average number of clicks each grade group made within the LMS.
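The aggregation behind a chart like this is simple to sketch. The following is a minimal, hypothetical Python illustration of joining clickstream totals to grades and averaging per grade band; the records, grade labels and click counts are invented for illustration, not the project's actual data or schema.

```python
from collections import defaultdict

# Hypothetical rows after joining LMS clickstream data with the student
# administration system: (student_id, final_grade, total_clicks).
records = [
    ("s1", "HD", 410), ("s2", "HD", 380),
    ("s3", "D", 300), ("s4", "P", 150),
    ("s5", "F", 60), ("s6", "F", 90),
]

def average_clicks_by_grade(rows):
    """Group total clicks by grade and return the mean for each grade band."""
    by_grade = defaultdict(list)
    for _, grade, clicks in rows:
        by_grade[grade].append(clicks)
    return {grade: sum(c) / len(c) for grade, c in by_grade.items()}

print(average_clicks_by_grade(records))
# {'HD': 395.0, 'D': 300.0, 'P': 150.0, 'F': 75.0}
```

The same grouping works for any per-student metric drawn from the combined data source.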
  • This chart is taken from approximately 400,000 student forum posts. The students were grouped by grade, and we looked at the average number of question marks in the text for each grade group.
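Counting question marks per grade group is a similarly small computation. This sketch uses invented posts and grades, purely to show the shape of the analysis:

```python
from collections import defaultdict

# Hypothetical forum posts: (grade of the post's author, post text).
posts = [
    ("F", "Why is this not working??"),
    ("F", "What do we do? When is it due?"),
    ("HD", "Here is a worked solution to question 3."),
    ("HD", "Does this approach generalise?"),
]

def avg_question_marks_by_grade(rows):
    """Average number of '?' characters per post for each grade group."""
    marks = defaultdict(list)
    for grade, text in rows:
        marks[grade].append(text.count("?"))
    return {grade: sum(m) / len(m) for grade, m in marks.items()}

print(avg_question_marks_by_grade(posts))
# {'F': 2.0, 'HD': 0.5}
```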
  • More recently we have been experimenting with, and thinking about, how best to use the data we have in the context of enhancing learning and teaching, particularly in light of the attention that analytics is getting in the higher education media at the moment.
  • Articles touting the advantages provided by analytics are everywhere at the moment, such as this one from The Australian's Higher Education section in October.
  • From Educause earlier this year
  • And the Horizon Report from last year. In fact, the Australian Horizon Report from this year, which Neil Selwyn mentioned yesterday, places analytics in the "one year or less" category in terms of time to adoption.
  • As a group of academics who come from an IT background, we get a little concerned when we hear this sort of hype around a particular technology. A couple of people have mentioned the hype cycles associated with emerging educational technology, such as Neil Selwyn yesterday, who spoke about hype, hope and disappointment. This particular hype cycle is from Gartner, and I would suggest that analytics currently sits somewhere between the technology trigger and the peak of inflated expectations. So most universities are now thinking about analytics and how they can make use of the data they have been collecting to improve the way they do things. During our journey over the last four years, we have identified a number of potential problems that we anticipate universities will likely face with regard to their own analytics projects.
  • The four problems we anticipate are: abstract representations of data hiding the underlying detail; typical university organisational structures limiting the potential of analytics; the age-old problem of confusing correlation with causation; and organisations assuming causality in analytics data.
  • I think Gardner Campbell summed it up nicely with this quote from the Learning Analytics and Knowledge conference earlier this year.
  • Let's look at the average number of forum posts and replies for over 39,000 distance student-course combinations. You can see the correlation between the students' level of forum activity and their grades, and this, for me, fits nicely with my confirmation bias suggesting that this should be so.
  • But if we look at the average number of forum contributions per student across all courses in a single year, you will note how significant the variation really is. Most courses are taught by different teachers with different design philosophies, and consequently they all tend to utilize the discussion forums in different ways.
  • And if we drill down even further, to a single student who has received a HD in every course they have completed to date, you can see the diversity in the number of forum contributions they made across their 18 courses. Our experience with the Indicators project is that the devil is very much in the detail when it comes to aggregated analytics data: the aggregations we see at the macro level of analysis don't really help us a great deal at the micro level.
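The way a single mean hides course-to-course variation can be shown numerically: the average of a highly variable series says little about any individual course. This sketch uses invented contribution counts for a hypothetical student's 18 courses, not the actual student's data:

```python
import statistics

# Invented forum-contribution counts for one student across 18 courses.
contributions = [32, 1, 0, 15, 4, 28, 2, 0, 9, 21, 3, 0, 17, 5, 30, 1, 8, 2]

mean = statistics.mean(contributions)
spread = statistics.pstdev(contributions)
print(f"mean={mean:.1f}, stdev={spread:.1f}, "
      f"range={min(contributions)}-{max(contributions)}")
```

Here the mean is around 10 contributions, yet individual courses range from 0 to 32, so the aggregate is a poor predictor at the course level.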
  • The next of the four problems likely to be experienced by universities entering into the analytics arena relates to typical university organizational structures.
  • Most universities are structured in a very deliberate, reductionist way. People are organized into units based on their task or role within the university; for example, IT folk tend to live in the IT area, and academics are arranged according to discipline. While these divisions between organizational units are imaginary, I'm sure all of us here have experienced the frustration associated with a lack of cross-unit cooperation.
  • The ever-constant battle for budgets often leads to inter-departmental rivalry, which can hinder the cross-organizational collaboration that analytics requires. For example, because we aren't IT, and it is unusual to give non-IT folk access to the backend databases of systems such as Blackboard, Moodle, PeopleSoft et cetera. Perhaps the bigger issue is that analytics requires a set of skills that, given typical university structures, doesn't typically exist in a single department: for example, database administrators and educational developers. So organizational structures can impact what can be achieved with analytics.
  • The next problem I would like to highlight is the confusion between correlation and causation. We simply can't assume that the correlations we are noticing with analytics are based on causation. For example:
  • If we look at the simple pattern from before that correlated student forum posts and replies with their resulting grades.
  • And how the correlation didn't necessarily hold true as we drilled down from the macro to the micro level. The data extracted from learning environments stems from a very complex interplay of variables, and it would be dangerous to assume otherwise. Every human operating within these learning environments is different, and will behave differently from hour to hour, much less from term to term, so it would be prudent to exercise some caution when making assumptions based on analytics-derived information.
  • The fourth and final problem I would like to talk about today, and one that worries me greatly, relates to the assumption that analytics data is based on causality: that is, when university management uses analytics information as a metric or KPI. The danger is that the correlations exposed by the data are assumed to be universal constants when, as we have come to learn, the underlying system is vastly more complex.
  • For example, when CQUniversity adopted Moodle, a set of minimum course standards was mandated. Every Moodle course had to have a discussion forum where students could freely interact with staff, and if you remember the correlation between student forum contributions and their resulting grades from before, you can understand how they arrived at this decision. While arguably a noble goal, the result was that 39% of these forums had fewer than five contributions by staff and students, for a whole range of valid, and sometimes not so valid, reasons. The system from which analytics draws its data is very complex, with a vast array of variables and interdependencies.
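A figure like the 39% is just a proportion computed over per-forum contribution totals. A minimal sketch with invented totals (not the actual CQUniversity data):

```python
# Invented per-forum totals of staff plus student contributions.
forum_totals = [0, 2, 12, 45, 3, 1, 30, 7, 0, 4]

# Forums that fall below the minimum-activity threshold of five.
quiet = [t for t in forum_totals if t < 5]
fraction_quiet = len(quiet) / len(forum_totals)
print(f"{fraction_quiet:.0%} of forums had fewer than five contributions")
# 60% of forums had fewer than five contributions
```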
  • So with these problems in mind, we are beginning to look at analytics through the lens of complex adaptive systems.
  • Complex adaptive systems are a variation on complex systems and have been described as systems that involve many components that adapt, learn or change as they interact. Each agent within a complex adaptive system is nested within other systems, all evolving and interacting, such that we cannot understand any of the agents or systems without reference to the others. In simple terms, context is king when it comes to using analytics to improve learning and teaching, so we can't easily interpret analytics data without reference to the context in which it was derived.
  • So, from the perspective of using analytics to enhance learning and teaching, we are much less concerned with the retrospective data representations and interpretations at the macro level, even though the correlations at this level often appear to be quite distinct.
  • We are aiming to focus more on the micro levels.
  • So what we are talking about here is a bottom-up approach to the representation and interpretation of analytics data. To some extent, this stands in opposition to the way that universities and their learning and teaching are being managed at the moment.
  • So, given that analytics data is extracted from a complex array of interacting systems, we intend to focus our efforts on the bottom right-hand corner of this slide, at the course level. That way, the people operating within the context are the people interpreting and making decisions based on the analytics data. While we see the importance of providing analytics-derived insights to students, much like Purdue University have done with their Signals project, initially it's likely to be the teacher who has the right mix of closeness to, and knowledge of, the context from which the analytics information is extracted.
  • So we are aiming to provide the teaching academics with better information. To borrow David's car analogy, it's about using analytics to augment the driver. A lot of modern cars turn off the lights for you when you get out and leave them on, or give that annoying beep when you haven't fastened your seat belt. The car is smart enough to help the driver out with the vehicle's operation. These sorts of augmentations are what we would like to see within the LMS.
  • We trialled what was really just a proof of concept about 12 months ago: a web page that combined data from the student admin system with clickstream data from the LMS to make a determination about which students most need some help at this particular point in the term. This was coupled with a mail-merge facility that helped the teaching academic facilitate a basic intervention.
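The selection step of a proof of concept like that might look something like the following sketch. The field names, threshold and data are invented for illustration; they are not the actual system's logic:

```python
# Hypothetical roster combining admin and clickstream data per student.
students = [
    {"name": "Alice", "clicks_this_week": 0},
    {"name": "Bao", "clicks_this_week": 52},
    {"name": "Carla", "clicks_this_week": 3},
]

def needs_attention(roster, click_threshold=5):
    """Return students below the click threshold, least active first."""
    flagged = [s for s in roster if s["clicks_this_week"] < click_threshold]
    return sorted(flagged, key=lambda s: s["clicks_this_week"])

for s in needs_attention(students):
    print(f"Mail-merge intervention candidate: {s['name']}")
# prints Alice, then Carla
```

The point of keeping the logic this transparent is that the teacher, who knows the course context, stays in the loop to decide whether an intervention is actually warranted.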
  • I guess, to link with Neil Selwyn's keynote from yesterday, we are endeavouring to use analytics to nurture evolution from the present, rather than leap in boots and all with some revolutionary new approach.

    1. Analytics and Complexity. Learning and leading for the future. Colin Beer (CQUni), David Jones (USQ), Damien Clark (CQUni)
    2. Some definitions
    3. Managerialism: "The teleological approach to the management of universities is known as managerialism and its influence has extended to how universities manage their learning and teaching" (Beer, Jones & Clark, 2012)
    4. Educational Data Mining: "Educational Data Mining is an emerging discipline, concerned with developing methods for exploring the unique types of data that come from educational settings, and using those methods to better understand students, and the settings which they learn in." (George Siemens, 2011)
    5. Academic Analytics: "marries statistical techniques and predictive modeling with the large data sets collected by HEI, including those collected by the LMS. Academic analytics has been described as business intelligence for HEI and is focused on the needs of the institution, such as recruitment, retention and pass rates" (Open University, 2012)
    6. Learning Analytics: "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (George Siemens, 2011)
    7. [Diagram: organisational levels (National, Institutional, Faculty, School, Program, Course) mapped against educational data mining, academic analytics and learning analytics. Adapted from Siemens (2011)]
    8. • Multiple CQUniversity internal L&T grants
       • DEHUB research grant (2011)
       • Numerous publications
       • Numerous conference presentations
       • Established 2008
    9. The Indicators story so far
    10. [Diagram: data sources feeding the Indicators project: Blackboard (300M+ records), Moodle (80M+ records), Moodle 2 (30M+ records), PeopleSoft (+80,000 student records and +80,000 student results) and SRQ (+5000 records)]
    11. Some simple patterns [Chart: Hits (n=39087); average student clicks (0-450) by student grade (WF, F, P, C, D, HD)]
    12. Some simple patterns [Chart: First Day of Access (n=35623), distance students; first day of access (-5 to 5) by student grade (F, P, C, D, HD)]
    13. Some simple patterns [Chart: average number of question marks (0-2.5) by student grade (F, P, C, D, HD)]
    15. The next big thing: "BIG data sets showing what students do online may prove as vital to education as genome databases have been to genetics or Europe's Large Hadron Collider to physics" (The Australian, 15th October 2012)
    16. The next big thing: "EDUCAUSE and the Bill and Melinda Gates Foundation have targeted learning analytics as one of 5 categories for funding initiatives" (Educause, 2012)
    17. The next big thing: "Learning analytics promises to harness the power of advances in data mining, interpretation, and modeling to improve understandings of teaching and learning, and to tailor education to individual students more effectively." (Horizon Report, 2011)
    19. Potential Problems
        Abstraction losing detail
        Organisational structures
        Confusion between correlation and causation
        Assumptions of causality
    20. Abstraction losing detail: "…the nature of learning analytics and its reliance on abstracting patterns or relationships from data has a tendency to hide the complexity of reality" (Gardner Campbell, 2012)
    21. Some simple patterns [Chart: average number of forum posts and replies (0.00-4.50) by student grade (F, P, C, D, HD)]
    22. The mythical mean [Chart: average number of contributions per student (0-20) across Moodle courses in a single year]
    23. Single HD student [Chart: number of forum contributions (0-35) across 18 individual courses]
    24. Organisational Structures
    25. Design by division
    26. Inter-departmental rivalry
    27. Confusing correlation with causation
    28. Some simple patterns [Chart: repeat of slide 21; forum posts and replies (0.00-4.50) by student grade (F, P, C, D, HD)]
    29. Single HD student [Chart: repeat of slide 23; forum contributions (0-35) across 18 courses]
    30. Assumptions of causality
    31. Minimum Standards [Pie chart: 39% / 61%]
    32. Complex adaptive systems
    33. Complex adaptive systems: "A CAS is a dynamic network of semiautonomous, competing and collaborating individuals who interact and coevolve in nonlinear ways with their surrounding environment. These interactions lead to various webs of relationships that influence the system's performance" (Boustani, 2012)
    34. Macro level
    35. Micro level
    37. [Diagram: repeat of slide 7; organisational levels (National, Institutional, Faculty, School, Program, Course) mapped against educational data mining, academic analytics and learning analytics. Adapted from Siemens (2011)]