Learning Analytics in an Era of Digitisation (February 2014)

  • Key Points: We will briefly discuss the background to, and current status of, LA at CSU. In “Principles of Learning Analytics” we will explore a) what LA is and b) some key principles that are important for the successful application of LA. In “LA in Higher Ed” we will discuss how LA has been embraced by the higher ed sector and some of the issues around its rapid growth. In “Lessons Learned” we identify some of the “success factors” for implementing LA. In “Future Developments” we talk about where we see LA heading at CSU and across the higher ed sector. We will allow a few minutes for questions at the end of each of the five areas to break up the presentation.
  • Key Points: CSU operates across 5 main campuses in Australia (in regional NSW and the ACT). In addition, we have a campus in Ontario, Canada and study centres in most major centres. We also work with 15 partner institutions across Australia and Asia. The “tyranny of distance” was a real problem for CSU: many of our students live in regional or remote areas, or work full-time and study “on the side”. The institution has embraced a range of educational technologies to help us reach out to these students, and to allow us to work collaboratively across different campuses. Online education is part of our DNA and we are currently developing online components for all subjects offered by the Uni. CSU developed a Learning Analytics Strategy in 2013 with wide representation across the University. The Strategy addresses the development of a multi-dimensional learning analytics system to support adaptive practice by students and teaching staff, and adaptive systems.
  • Key Points: The important point here is that we use a wide range of Ed Techs. In some ways, we are lucky that technology is used so ubiquitously across the Uni. Students and staff create “digital traces” as they interact with these technologies. It is these digital traces that become the fuel for an integrated learning analytics system.
  • Key Points: The definition of LA provided comes from the Society for Learning Analytics Research (SoLAR). Importantly, LA is about influencing outcomes and practice; it is not just about standing back and observing. The point of it is to provide feedback: to prompt and inform adaptation and response by students; by teaching, admin and support staff; and by educational technology systems. Our philosophy at CSU is that student agency is important: if we give students the insights and tools, they will use them. This stems from the fact that many of our students are older and often already in the workforce, but it is also consistent with and supportive of our Graduate Learning Outcomes (e.g. work-ready and able to apply discipline expertise in professional practice, and able to learn effectively in a range of environments, including online). So, broadly speaking, how does Learning Analytics prompt and inform adaptation...
  • Key Points: Learning analytics serves a number of purposes/drivers. University-level drivers include things like the development of rounded students who meet our Graduate Learning Outcomes. Course-level drivers might include enhancing the quality of course design. Subject-level drivers might include student success and engagement, quality subject design, quality teaching and quality learning and teaching environments. LA applies a suite of metrics and analytic methods (mediated by our technological affordances) to allow audiences access to data, via presentation formats that allow efficient exploration and interpretation of data to create knowledge. Presentation formats can include dashboards, reports, lists and dynamic feedback. Importantly, learning analytics is about driving adaptation among its users/audiences. Those adaptations can occur at multiple levels: for example, at the level of the learning design; at the behavioural level for students, academics and even student support staff; and/or at the systems level. Critically, this adaptation is evidence-based through the insights provided by the learning analytics.
  • Key Points: There is the capability to generate masses of data through LA. Even “off the shelf” systems, like Blackboard Analytics, are capable of efficiently bringing together huge volumes of previously disparate data. But just because we can measure it doesn’t mean that we should measure it or that we need to measure it. Learning analytics systems need to be linked to, and reflective of, our evidence-based understanding of student learning and teaching. What we measure must be directly relevant to both the learning design and the learning process. That is, we aim to measure student behaviour with regard to features of the learning environment that are important to a particular learning design... and exclude those features that are not relevant to that design. We need to avoid the “copycat” mentality (this worked at Columbia, so it must be important, right? Or: all the research shows that forum activity is a great predictor of student grades, but I don’t use forums in my subject design...) and tailor learning analytics to the theories of learning and teaching that operate in our own institutions (see the sketch below). Importantly, this means also thinking about the process of learning within the individual. Is the individual responding to the learning design in the ways intended? If not, why not? Without this theoretical context, learning analytics is just data gathering in search of correlations without meaning. But learning analytics also needs to look beyond just theory and pedagogy to the objectives of the institution: there is the opportunity to link learning analytics to Graduate Learning Outcomes (addressing the question of whether our students are actually achieving the overarching outcomes that we intend) and other strategic objectives of the university.
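To make the design-tailoring point concrete, here is a minimal Python sketch of design-aware metric selection. All feature and metric names are invented for illustration; the only idea taken from the notes is that metrics should be computed only for features the subject's learning design actually uses.

    # Map each candidate metric to the learning-design feature it measures.
    # Names are hypothetical, for illustration only.
    METRIC_FEATURES = {
        "forum_posts_per_week": "discussion_forum",
        "quiz_first_attempt_score": "online_quiz",
        "video_completion_rate": "lecture_video",
        "wiki_edits": "collaborative_wiki",
    }

    def relevant_metrics(design_features):
        """Return only the metrics grounded in this subject's learning design."""
        return [metric for metric, feature in METRIC_FEATURES.items()
                if feature in design_features]

    # A subject built around videos and quizzes, with no forums: forum metrics
    # are excluded rather than copied in because they "worked elsewhere".
    print(relevant_metrics({"lecture_video", "online_quiz"}))
    # -> ['quiz_first_attempt_score', 'video_completion_rate']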
  • Key Points: If our focus is to predict and influence student success, we need to understand the factors that enable such success. At CSU, we conceive of student success as a product of: the student (their level of engagement with the learning opportunities provided); the teaching we provide (the quality of the design and facilitation of learning opportunities); and the institution (the quality and timeliness of the support we provide to students along their journey). This support comes from many sources, but most notably from the faculty in which the student is enrolled, Academic Support, Student Services and Library Services. These features, though, sit within the context of the Uni’s strategic direction and its policies. The challenge is to provide a suite of analytics relevant (and tailored) to all three success factors, as sketched below. This is, in fact, an organisation-wide challenge, which needs to draw together the intelligence and insights gathered by a range of players.
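A hedged sketch of that three-factor view of student success follows. The factor names come from the notes above; the 0-1 scaling, field definitions and weights are placeholder assumptions, not CSU's actual model.

    from dataclasses import dataclass

    @dataclass
    class SuccessFactors:
        engagement: float          # the student: engagement with learning opportunities (0-1)
        teaching_quality: float    # the teaching: quality of design and facilitation (0-1)
        support_timeliness: float  # the institution: quality/timeliness of support (0-1)

    def success_indicator(f):
        """Weighted blend of the three factors; the weights are illustrative only."""
        return 0.5 * f.engagement + 0.3 * f.teaching_quality + 0.2 * f.support_timeliness

    # Example: weak engagement partially offset by strong teaching and support.
    print(round(success_indicator(SuccessFactors(0.4, 0.8, 0.9)), 2))  # 0.62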
  • Key Points: There is the potential for learning analytics to be seen as “big brother”... or the “panopticon” (which may not entirely be a bad thing). If trust is lost in LA, students will opt out or artificially alter their behaviour because they know they are being observed (or worse), and you’ll be left with a “hollow vessel”. Transparency and discipline around the collection and use of data is absolutely essential. In theory, in LA we are on strong ground: we collect, analyse and report on data for the purpose of helping improve student outcomes. A noble cause. But how far is it reasonable to go in this? Technology allows us to do many things. Should we monitor student Facebook and Twitter accounts? Should we monitor “private” online chats between students? Even a simple principle like using data for the purpose for which it was gathered can be problematic. E.g. your institution’s bookshop says, “You’ve got all this data on which resources students are using; tell us which textbooks we should order for next semester to improve our revenue...” Or an academic says, “I’ve got this student who seems totally unengaged, but knocks the ball out of the park in exams. I suspect they are cheating; what evidence can you provide me?” (A simple purpose check is sketched below.) Having an approach to LA strongly grounded in theory and pedagogy allows us to be transparent and credible when talking about what we are measuring and why. It also brings learning design and its delivery into the picture, ensuring that LA does not just become about “blaming the student” for failure.
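Here is an illustrative purpose-limitation check for the principle above, that data is only used for the purpose for which it was gathered. The registry and purpose labels are hypothetical; the bookshop request comes from the notes.

    # Record, per dataset, the purposes students consented to at collection time.
    CONSENTED_PURPOSES = {
        "lms_activity": {"learning_support", "subject_design_evaluation"},
    }

    def release_allowed(dataset, requested_purpose):
        """Refuse any request whose purpose was not consented to at collection."""
        return requested_purpose in CONSENTED_PURPOSES.get(dataset, set())

    print(release_allowed("lms_activity", "learning_support"))   # True
    print(release_allowed("lms_activity", "bookshop_revenue"))   # False: out of scope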
  • Key Points: The growth of learning analytics is only in part explained by the fact that it is good/helpful for students and teaching staff. While the rapid expansion of technological capability underlies LA’s growth (we have increased availability, range, sophistication and uptake of educational technologies in recent years), the interest in the field also stems from the fact that it is one part of the answer to the complicated question facing universities (and, indeed, all teaching institutions): how to do more for more with less? It is LA’s promise of improving student outcomes, at scale, that drives growth in learning analytics. In this way, LA becomes a key part of an institution’s competitive advantage by enabling a “better” student experience and outcome through a) timely and personalised formative feedback on student progress and b) the more efficient and responsive application of institutional resources to support the student on their journey, in a context that risks becoming more de-personalised and commoditised as enrolments grow and course offerings expand and open up. But...
  • Key Points: The accessibility of built-in analytics packages in some LMSs has helped foster interest in, and uptake of, learning analytics, particularly among academics and teaching staff. These packages are very good at what they do; it’s just that “what they do” (like any “off the shelf” solution) has limited utility. The next phase for institutions is to start to design LA systems that are relevant to: their institutions (the technologies and systems used, and the needs and expectations of a variety of user groups, from academics to student support staff); their students (their contexts and backgrounds, their needs and expectations); and their teaching (the learning and teaching philosophies and theories employed and the learning designs in place). To do so, there are challenges around: understanding the full range of analytic opportunities available; capturing data from the range of systems/technologies employed at a student level; integrating data sources, which is particularly challenging where learning designs take in technologies outside the LMS (a minimal integration sketch follows); and implementing data governance around such systems to ensure that the right data is being collected and used in the right ways by the right people at the right times.
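A minimal sketch of the student-level integration challenge: joining records from systems inside and outside the LMS on a shared student identifier, here using pandas. The systems, columns and values are assumptions for illustration only.

    import pandas as pd

    # Per-student extracts from three hypothetical systems.
    lms = pd.DataFrame({"student_id": [1, 2, 3], "lms_logins": [12, 3, 25]})
    library = pd.DataFrame({"student_id": [1, 3], "resources_accessed": [7, 19]})
    eportfolio = pd.DataFrame({"student_id": [2, 3], "artefacts_submitted": [1, 4]})

    # Outer joins keep students who appear in only some systems, so gaps are
    # made explicit rather than silently dropped.
    merged = (lms.merge(library, on="student_id", how="outer")
                 .merge(eportfolio, on="student_id", how="outer"))
    print(merged.fillna(0))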
  • Key Points: Learning analytics is about people and is used by people; to focus on it as a technical problem is to see only half the picture. Stakeholders from across the institution need to be engaged in the system: they need to trust its intent and its application. If learning analytics is seen by either students or staff as an exercise in control or scrutiny, it will experience resistance (and the benefits will be lost). If academics just see it as another “layer of work”, forcing them into a more time-consuming “case management” approach to their students, it is unlikely to be embraced. Therefore, it is important to understand who (in the institution) is responsible for leading/facilitating what responses to the insights that emerge from the data. Learning analytics also has to learn. There needs to be a commitment to ongoing data mining and evaluation of the analytics, models and algorithms used (see the re-evaluation sketch below). As organisational circumstances change, as user needs/expectations change, as students change, as learning designs and educational technologies change, and as learning theory and science evolve, so too should our approaches to learning analytics. Seeing learning analytics as a “set and forget” black box of clever analytic tricks will no doubt lead to obsolescence.
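One way to read the "learning analytics also has to learn" point in code: re-evaluate a predictive model against each fresh cohort rather than treating it as a set-and-forget black box. The model choice, synthetic features and accuracy threshold below are illustrative assumptions, not anything prescribed by the presentation.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    def still_fit_for_purpose(X, y, threshold=0.7):
        """Cross-validate on the latest cohort; flag the model for review
        if accuracy has drifted below the agreed threshold."""
        scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
        return scores.mean() >= threshold

    # Synthetic stand-in for a semester's engagement features and pass/fail labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)
    print("model still fit for purpose:", still_fit_for_purpose(X, y))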
  • Key Points: Is this our future student? Possibly not, but the future of LA lies in being able to infer the occurrence and the quality of learning at the level of an individual student. That is, by using analytic techniques that don’t just look at patterns of student behaviour across an online learning environment, but also explore, in depth, the patterns of behaviour in specific instances: student behaviour in online tests and tutorials, being able to distinguish someone who knows the answers from someone who is guessing, or someone who genuinely doesn’t know the answer from someone who just made a slip, and possibly adapting the delivery of subsequent content (a toy version of this inference is sketched below); student language in forums, chats, blogs and reflective journals (discourse analytics; research by Mercer et al. has identified different types of discourse associated with the occurrence of quality learning, e.g. Disputational, Cumulative and Exploratory dialogue); student use of online reading resources (e.g. are they highlighting the right/relevant sections of text, are they making annotations that are relevant to the key points); and true multi-dimensional information: integrating online behaviour and offline behaviour (wearable computing). The list will grow... Thus, while we may not get to the point where we have automated neural scanning equipment in classrooms to measure the neurological changes that denote learning, we can draw inferences about learning from detailed study of student behaviours in defined online task and social environments. Once we can draw an inference about a student’s learning, our challenge is to create educational technologies and systems that are responsive at the level of the individual. That is, to achieve a true personalisation of the learning environment: not based on superficial preferences, but built on an evidence-based understanding of what works for a particular student learning particular content.
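A toy illustration of the item-level inference described above: labelling a quiz response as knowledge, a likely guess, a likely slip or a genuine gap from correctness and response time. Real systems would need calibrated models (e.g. item response theory); the thresholds and labels here are invented purely to make the idea concrete.

    def classify_response(correct, seconds, item_median_seconds):
        """Crude heuristic: answering well under the item's typical time is
        treated as suspiciously fast, whatever the outcome."""
        fast = seconds < 0.5 * item_median_seconds
        if correct:
            return "likely guess" if fast else "knows"      # fast + correct: maybe lucky
        return "likely slip" if fast else "knowledge gap"   # fast + wrong: maybe careless

    print(classify_response(True, 40.0, item_median_seconds=35.0))   # knows
    print(classify_response(False, 6.0, item_median_seconds=35.0))   # likely slip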

    1. Learning Analytics in an Era of Digitisation, February 2014. Simon Welsh, Senior Learning Analytics Officer, Strategic Learning and Teaching Innovation, Charles Sturt University (siwelsh@csu.edu.au); Assoc Professor Philip Uys, Director, Strategic Learning and Teaching Innovation, Charles Sturt University (puys@csu.edu.au). Division of Student Learning.
    2. Contents: 1. CSU Context; 2. Principles in Learning Analytics; 3. Learning Analytics in Higher Education; 4. Lessons Learned; 5. Future Developments
    3. 1. CSU Context • Charles Sturt University is a regional and international, multi-campus institution with around 40,000 students • Approximately 60% of students undertake distance education courses, with a further 15% enrolled in blended courses • CSU has invested heavily in educational technologies to provide reliable and equitable access to resources for students and staff alike • In 2013, we developed a Learning Analytics Strategy which is now moving to implementation
    4. 1. CSU Context • Some of our educational technologies include: Course Eval, Yammer, PebblePad, Turnitin, EASTS, InPlace, Blackboard Learn, Digital Object Management System, subject forums and wikis, PODs, Adobe Captivate, mLearn, Bridgit, Adobe Connect, CSU Replay, Simulations, SMART Tools
    5. Questions...
    6. 2. Principles in Learning Analytics • Learning Analytics is defined as: the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs (SoLAR) • Learning Analytics is about helping students succeed by providing: o students with the self-awareness and insight to optimise their learning behaviours; o teaching and support staff with insight to make meaningful adaptations to their practice, as well as effective interventions; and o evidence to enable the adaptation of learning and teaching systems
    7. 2. Principles in Learning Analytics [Framework diagram: Drivers (University, Course, Subject) feed Metrics and Methods, delivered via Presentation Formats to Audiences, prompting Adaptations (Design, Behaviour, Systems), all mediated by the affordances of LA technologies]
    8. 2. Principles in Learning Analytics • Learning Analytics is sometimes referred to as “big data” in an educational context – but there is a danger in this short-hand • Learning Analytics must be proximal to learning theory/science and design • Theory, pedagogy and University objectives on different levels help us understand what to measure, why and how to respond • Learning Analytics that is not connected to theory, pedagogy and outcomes is just “counting clicks”
    9. 2. Principles in Learning Analytics • Student success is a product of the interplay of the student, the teaching and the institution [Diagram: Student Success arises from Student Engagement, Learning and Teaching Design and Delivery, and Support (Faculty, Academic Support, Student Services), all within the context of University Strategy and Policy]
    10. 2. Principles in Learning Analytics • Learning Analytics requires trust to work • It is essential to have a strong Ethics and Privacy Framework in place • A key principle: that data is only used for the purpose for which it was originally gathered • The legal aspects may actually be the most straightforward – earning the trust of students and staff may be the real challenge • Theory and pedagogy gives focus and purpose
    11. Questions...
    12. 3. Learning Analytics in Higher Education • Increasing usage of educational technologies such as LMSs (as described before) and wider usage in universities in this era of digitisation • Learning Analytics is a rapidly growing field in higher education in Australia and around the world: “the data tsunami” (Simon Buckingham Shum) • This growth is driven by a number of strategic issues affecting universities, such as increasing enrolments, higher student expectations and lower funding • Learning Analytics becomes the new competitive advantage
    13. 3. Learning Analytics in Higher Education • With increasing interest in the field and the release of easy-to-use analytic packages, the adoption of Learning Analytics has been fragmented in many institutions • While the embrace of Learning Analytics should be encouraged, the opportunity is to move beyond the (often) simplistic analytics in many pre-built packages to develop analytics that reflect our institutions, our students and our teaching • This means moving to a multi-dimensional landscape, where we source and integrate data about a student from a wide variety of sources (often outside the LMS)
    14. Questions...
    15. 4. Lessons Learned • Learning Analytics is not just a technical challenge – it’s about people, culture and practice • For Learning Analytics to truly be an “adaptation engine”, strong stakeholder engagement (and trust) is essential • Critical to think through roles and responsibilities: Learning Analytics can’t just be about creating more work for academics/teaching staff • It is also an evolutionary process in its own right – Learning Analytics doesn’t just help others adapt, it must be adaptive in itself • LA requires university-wide collaboration and integration
    16. Questions...
    17. 5. Future Developments [Image credit: Ars Electronica (CC BY-NC-ND)]
    18. Questions...
    19. Summary • Learning Analytics is about prompting and informing adaptation • Analytics are required on different levels, including course, subject and university • To do so requires our analytics to be proximal to university objectives, learning and teaching theory, and design • Learning Analytics operates on trust • Learning Analytics works best where it is multi-dimensional • To achieve this, broad stakeholder engagement is required • The future is about inferring and influencing the occurrence of learning at an individual level, both online and offline
    20. Thank You. Simon Welsh, Senior Learning Analytics Officer, Strategic Learning and Teaching Innovation, Charles Sturt University (siwelsh@csu.edu.au); Assoc Professor Philip Uys, Director, Strategic Learning and Teaching Innovation, Charles Sturt University (puys@csu.edu.au)
