Learning Analytics & Learning Design
Patrick Lynch
p.lynch@hull.ac.uk
@thebigparticle
UK City of Culture 2017-20
Apereo Software Communities
OA LAP
OA LRS
OA Dashboard
Agenda
• Learning Analytics, where am I coming from?
• why the emphasis on Learning Design
• so what?
• what next?
Learning Analytics: where am I coming from?
A plug
Sclater, N. (2017). Learning Analytics Explained. Routledge.
2012
Stages of analytics use
Basic Analytics
• What has happened, e.g. HESA returns
Automated Analytics
• What is happening, e.g. set a “rule” to trigger an alert if a grade drops below a certain level (see the sketch below)
Predictive Analytics
• What might happen, e.g. use large amounts of historical data to create predictive models
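A minimal sketch of such an automated rule, assuming nothing about any particular LMS (the threshold, data layout and function name are illustrative, not a real API):

```python
# Sketch of an automated analytics "rule": flag students whose latest
# grade falls below a threshold. All names and data are illustrative.

GRADE_THRESHOLD = 40  # hypothetical pass mark on a 0-100 scale

def grade_alerts(latest_grades, threshold=GRADE_THRESHOLD):
    """Return (student_id, grade) pairs that should trigger an alert."""
    return [(student, grade)
            for student, grade in latest_grades.items()
            if grade < threshold]

if __name__ == "__main__":
    grades = {"s001": 72, "s002": 38, "s003": 55}  # made-up data
    for student, grade in grade_alerts(grades):
        print(f"ALERT: {student} scored {grade}, below {GRADE_THRESHOLD}")
```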
So what has changed?
• 5 Vs of Big Data
– Volume
– Velocity
– Variety
– Variability
– Veracity
• Or was that 7 Vs?
– Visualisation
– Value
Definitions of Learning Analytics
The measurement, collection, analysis and reporting of data
about learners and their contexts, for purposes of
understanding and optimizing learning and the environments
in which it occurs. Siemens, G. (2011).
Analytics is the process of developing actionable insights
through problem definition and the application of statistical
models and analysis against existing and/or simulated future
data. Cooper, A. (2012)
Descriptive versus predictive analytics
My journey
2012
2013
Course fingerprint
Patrick’s favourite
Agile development – students as product owners
‘As a student I just want to
know how I am doing’
Learning design
“We are moving from an instructional paradigm to a learning
paradigm — from offering information to designing learning
experiences ... from thinking about [courses and programs] as
an aggregation of separate activities to becoming an integrated
design.”
Bass, R. (2012)
Learning design
“A ‘learning design’ is defined as the description of the teaching-
learning process that takes place in a unit of learning (e.g., a course,
a lesson or any other designed learning event). The key principle in
learning design is that it represents the learning activities and the
support activities that are performed by different persons (learners,
teachers) in the context of a unit of learning”
Koper (2006)
“not to transmit knowledge to a passive recipient, but to structure
the learner’s engagement with the knowledge, practising the high-
level cognitive skills that enable them to make that knowledge their
own”
(Laurillard, 2008)
Carpe Diem
Salmon, 2011
Learning design
“A methodology for enabling teachers/designers to make more
informed decisions in how they go about designing learning
activities and interventions, which is pedagogically informed
and makes effective use of appropriate resources and
technologies… Learning Design as an area of research
and development includes both gathering empirical
evidence to understand the design process, as well as
the development of a range of learning design resources, tools
and activities.”
Conole (2013)
Galley (2015)
Niall Sclater 2017
Context
Let’s not forget: Learning analytics are about learning
As a comparable analogy to teaching to the test rather than
teaching to improve understanding, learning analytics that do not
promote effective learning and teaching are susceptible to the use
of trivial measures such as increased number of log‐ins into an
LMS, as a way to evaluate learning progression. In order to avoid
such undesirable practices, the involvement of the relevant
stakeholders – e.g., learners, instructors, instructional designers,
information technology support, and institutional administrators
– is necessary in all stages of the development, implementation,
and evaluation of learning analytics and the culture that the
extensive use of data in education carries.
Gašević, D., Dawson, S. & Siemens, G. (2015)
Evidence
So what?
How do you know that the
learning design is working?
“Throughout the history of education, the
adoption of instructional programs and
practices has been driven more by
ideology, faddism, politics, and marketing
than by evidence.”
Slavin (2008)
“it is bordering on the unethical to
implement untried and untested
recommendations in educational
practice, just as it is unethical to use
untested products and procedures on
hospital patients without their consent”
Cohen, Manion and Morrison (2011)
Faced with access to large collections of
data and powerful open source analysis
software, researches (sic) will be subject
to a variety of temptations to poke about
in this data in thoroughly unprincipled
ways. While these fishing expeditions
may uncover seemingly interesting
relationships between constructs, without
an interpretive framework grounded in
specific theoretical commitments, the
data tail may come to wag the theory dog.
Atkisson & Wiley (2011)
Research design approach
• what is my research question?
– is my learning design successful?
• what data will I need to collect to answer that question?
• how will I collect that data?
• how will I analyse that data?
– which patterns represent successful outcomes?
– which patterns represent less successful
outcomes/failure?
• how can I use the analysis to improve the learning design? (a sketch of the analysis step follows below)
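One hedged illustration of the analysis step, assuming activity logs and module results have already been exported to flat files (the file names, column names and pass mark are assumptions for the sketch, not a prescribed format):

```python
# Sketch: compare the weekly VLE activity of students who passed with
# that of students who failed, as one way of asking which activity
# patterns accompany successful outcomes. All names are illustrative.
import pandas as pd

logs = pd.read_csv("vle_events.csv")   # assumed columns: student_id, week, event
results = pd.read_csv("results.csv")   # assumed columns: student_id, grade

# Events per student per week, joined to outcomes.
activity = (logs.groupby(["student_id", "week"]).size()
                .rename("events").reset_index())
merged = activity.merge(results, on="student_id")
merged["passed"] = merged["grade"] >= 40  # assumed pass mark

# Median weekly activity per outcome group: a first, purely descriptive
# look at whether engagement differs between the two groups.
profile = merged.groupby(["passed", "week"])["events"].median().unstack("week")
print(profile)
```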
7 Principles for good practice in undergraduate education
1. encourages contacts between students and faculty
2. develops reciprocity and cooperation among students
3. uses active learning techniques
4. gives prompt feedback
5. emphasizes time on task
6. communicates high expectations
7. respects diverse talents and ways of learning.
Chickering and Gamson, 1987
How learning works: 7 Research-Based Principles for Smart Teaching
• students’ prior knowledge can help or hinder learning
• students’ motivation determines, directs, and sustains what they do to learn
• how students organize knowledge influences how they learn and apply what
they know
• goal-directed practice coupled with targeted feedback enhances the quality of
students’ learning
• students’ current level of development interacts with the social, emotional,
and intellectual climate of the course to impact learning
• to develop mastery, students must acquire component skills, practice
integrating them, and know when to apply them.
• to become self-directed learners, students must learn to monitor and adjust
their approaches to learning.
Ambrose et al. 2010
Interventions: reactive design
Loop
Assessing whether “pedagogical intent” is reflected in student activity
Corrin et al. describe the development of a learning analytics tool, Loop, which
was funded by the Australian Government’s Office for Learning and Teaching.
It presents data to teachers on the interactions of students.
Loop uses data from the LMS to assess whether the pedagogical intent is
reflected in the students’ interactions, and presents this to teachers through
dashboards and reports.
Specifically, the tool integrates course structures and schedules in its
visualisations to help evaluate the effectiveness of the learning activities.
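Loop’s internals are not described here, but the underlying idea (checking planned activity against observed log events, week by week) can be sketched roughly as follows; the schedule layout and event-log columns are assumptions, not Loop’s actual data model:

```python
# Rough sketch of checking "pedagogical intent" against actual activity:
# for each scheduled activity, count how many students produced matching
# LMS events in that week. Illustrative only; not Loop's implementation.
import pandas as pd

# Assumed planned design: which tool students should use in which week.
schedule = pd.DataFrame({
    "week": [1, 2, 3],
    "tool": ["forum", "quiz", "assignment"],
})

# Assumed event log columns: week, tool, student_id.
events = pd.read_csv("lms_events.csv")

observed = events.groupby(["week", "tool"])["student_id"].nunique()
for _, row in schedule.iterrows():
    n = observed.get((row["week"], row["tool"]), 0)
    print(f"Week {row['week']}: planned {row['tool']}, "
          f"{n} students actually used it")
```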
https://www.unicon.net/LA-quick-start
Increasing engagement through course redesign at UMBC
Because the analytics showed that adaptive release was increasing
engagement, Hardy redesigned the entire course to use this feature. His
students then achieved scores in a common final exam that were 20% higher
than those of students who did not take his course. Hardy’s students also
earned higher grade point averages in the follow-on course, Intermediate
Accounting.
It is recognised that Hardy put particular effort into redesigning his course;
however, he now claims to be spending less time administering it, and sees
himself as more of a coach and facilitator than a lecturer. Thus not only
have the learning tools and activities been adapted as a result of the
analytics, but his teaching practice has too.
Becoming more agile
What next?
For Learning Analytics and Learning Design
• Arguably then learning design needs learning analytics in order to
validate itself. However it also works the other way: learning
analytics cannot be used effectively without an understanding of
the underlying learning design, including why the particular tools,
activities and content were selected and how they were deployed
• Meanwhile it is clear that there is a growing necessity for the
diverging communities of learning design and learning analytics
to come together to develop languages and tools to address the full
lifecycle of curriculum development and enhancement.
Sclater, 2017
At Hull
• design as a recognised activity
• design goals clearly articulated
• that identify the data and methodology up front
• that build knowledge through a set of LD patterns
shown to work (or not) in given contexts
• demand driven by TEF?
https://en.wikipedia.org/wiki/Ghostbusters_(franchise)#/media/File:Ghostbusters_logo.svg
Questions
p.lynch@hull.ac.uk
@thebigparticle


Editor's Notes

  • #3 Open with a couple of pitches. HULL: the University of Hull is the lead party in the UK City of Culture. From the worst place (b.heard comparison website, 2015 – Humber region) to one of the top 10 places in the world (Rough Guides, 2016); in March 2017 the Sunday Times named it one of the best places to live. The University: not much online, not the OU. We are also not American or Australian – they have other contexts – maybe we can learn from them, but remain sceptical about transferability; more on context to come.
  • #4 Apereo is an open source foundation with a range of communities – including Xerte, which most folks will know. It’s not just for developers.
  • #7 Niall would have been here, but he is at the Learning Analytics and Knowledge conference in Vancouver. His new book was published this month; chapter 6 covers learning analytics and curriculum design.
  • #8 Great work from CETIS in 2012 pulled some of this together.
  • #9 Just an overview of the transition in HE; FE has been ahead in many areas for some time, especially in regular reporting. A quick trip into LA: you probably know a lot already, but I want to set out where I am coming from. We have always had statistics (it is annoying how most VLE vendors have now changed Stats to Analytics). A rise in the BI movement; educational data mining; web analytics; learning analytics; AI; social network analysis; information visualisations – dashboards, the rise of fitness apps, the quantified-self movement; analysis from other sectors, retail and health informatics for example.
  • #10 Variety – not just structured data: text, image, audio … qualitative. Value – turning it into value. No source: it seems to be everywhere and I can’t pin it down. Data from 2014: Volume – 100 terabytes per day into Facebook, 4 TB from a self-driving car. Velocity – 300,000 tweets every minute. Variety – where is data coming from: text, image, voice … Variability – try the word ‘sick’ as an example; making meaning is hard. Veracity – accuracy of the data. And of course tools have emerged.
  • #11 The first is nice in that it talks about learners and the learning environment and optimising these. Learners and their contexts. The second for me crystallises that idea of actionable insights. We are looking for things to do. We have defined problems (questions) that we want to understand.
  • #12 Should learning analytics tools prompt instructors when learning outcomes are not being met? The SNAPP tool integrates with an LMS forum to show teachers visualisations of how learners are interacting. In one evaluation of its use, the system was reported to be largely used as a reflective tool, especially effective for analysing courses once they were completed. However, teachers were not using it to examine activity while the course was underway. This would have enabled immediate adaptation of the learning design if it was not functioning as expected. Lockyer and Dawson suggest that additional functionality may be required to prompt instructors when the learning outcomes are not being met. For example, the development of a learning community may be an integral part of the learning design, but if students are not interacting as planned, the teacher needs to know that they should check the visualisations and take action to address the problem. Lockyer & Dawson, Learning Designs and Learning Analytics, p. 155. I very much like descriptive analytics, not prescriptive. Focus on students and their contexts – not at this stage particularly excited about prediction.
  • #14 ALT-C 2012: Lee & Scoble looked at a PGCert HE module. The module required online use and they were interested in monitoring contribution to group work online. This did indeed present a confrontation with a hidden reality of what we actually do know about student activity. ALT-C 2013: Bridgeman, Gardiner & Lynch, Analytics at the sharp end – using Excel to analyse and present simple correlations between expected activity and actual activity. Started with a simple stats visualisation in Sakai – why had these students accessed a resource after the course finished? Small data – it fit in Excel. Since then, working on a live (overnight) link to the VLE activity log. Moved from 43 trackable events in the inbuilt statistics to 145 events in the system log. Started using Tableau (prototyping) and R.
  • #15 Started to play with 25 million records from the VLE with an overnight update, but still it seemed that the greatest insight was at a smaller scale. To the left is lecturer activity.
  • #17 Smaller view. Boxplot: median, min, max, first and third quartiles. The 1st to 3rd quartile range is the IQR; points beyond 3 × IQR are outliers, and 1.5 × IQR is the inner fence, i.e. suspected outliers (a short sketch of this arithmetic appears after these notes). That’s it for stats – because that didn’t work with students and it didn’t really work with staff. However, I was having some great conversations with staff and students. I asked students what they wanted to know. We went from ‘I just want to know how I am doing’ to designing improvements in the course design.
  • #18 I’m also interested in these people - learners
  • #19 However, I was having some great conversations with staff and students. I asked students what they wanted to know. We went from ‘I just want to know how I am doing’ to designing improvements in the course design. We went AGILE. Why agile? I was trained as a systems analyst. These students had no interest in competition, and no student focus group we worked with wanted any competition. The product owner needs to work in between development and the users. Not a zero-sum game. BIG QUESTION – what is our goal? The Friends and Family hospital survey – great satisfaction? Is that enough?
  • #21 Randy Bass (2012), Disrupting Ourselves: The Problem of Learning in Higher Education. References Clayton Christensen’s disruptive innovation.
  • #22 Back in 2004 Rob Koper was looking at something called Educational Modelling Language to describe learning, later working with IMS to develop the IMS-LD specification. Diana Laurillard also recognised the importance.
  • #24 So, OU Learning Design can be seen as being concerned with designing for what students do to learn, placing students’ learning experiences at the centre of design, production, presentation and evaluation processes, and supporting the delivery of coherent, engaging and effective learning experiences both within modules and along qualification pathways (Galley, 2015). Shouldn’t all learning design/teaching be conducted as at least action research?
  • #25 The OU framework is light on students and reads as mostly pre-implementation. Step 5 includes some analytics.
  • #26 Bigger picture. Where is the student?
  • #27 Looks like Kolb – simple, hides the complexity nicely. Learners as the key contributor. Metrics = analytics, visualisations, dashboards – actionable insight. Interventions – change the course, add a session. Isn’t this learning design?
  • #28 I’m also interested in these people - learners
  • #30 We are mostly using evidence that we happen to have found. Does it answer the questions we have? Good research design includes asking what evidence we need, how we will capture it and how we will analyse it.
  • #31 We looked a bit at what LA I am interested in and we have looked at some of the LD work. There is promise here, but also risk. So what should we be doing?
  • #33 So we are looking to verify the learning design using LA. LD needs LA to validate itself, but can LA work without understanding the underlying LD? One of the key texts on educational research.
  • #34 Another quote from Niall’s book
  • #35 What are we trying to achieve?
  • #36 So let’s consider these seven elements as things we might question – can we use data to help? Chickering and Gamson have a US focus – a colleague and I spent quite a bit of time trying to find a UK equivalent, with no luck. Can we measure these? How would we need to design the learning to yield learning analytics? How can we analyse that data to inform actions? Evidence of satisfaction, which may be good enough?
  • #38 Which of these are embedded in your designs or the designs you have experienced? These look harder, partly because they are expressed more as design recipes than activity for the student, but we could break them into activities – can you think of how we might design feedback loops around these elements?
  • #42 From Niall Sclater: In 2009 Tim Hardy at the University of Maryland Baltimore County (UMBC) redesigned his course, Principles of Accounting, in an attempt to increase student participation. The course subsequently became the most active in the LMS in the whole institution. Hardy had used the “adaptive release” feature in the Blackboard LMS, which requires students to meet specified conditions in order to access content. For example, before being able to access an assignment on spreadsheets, students had to pass a quiz based on a video he had created on the subject.
  • #43 Fastest: the student. How do we build analytics into the learning design? Niall talks about just-in-time changes made by the lecturer – the student is faster. Next, the lecturer: agile teaching – feedback. Next, the course: based on feedback (not just words). Next, institutional. Remember my students who just wanted to know how they were doing?
  • #45 Adaptive Learning and Cognitive Tutoring - These are sophisticated systems which provide highly personalized learning experiences by using a predefined “knowledge map” of a specific discipline, such as algebra, to understand what prerequisite knowledge a student may be missing that is leading them to not be able to solve a problem or complete an assignment. The system will then help the learner address this missing knowledge and allow them to move on once they have demonstrated they have mastered the concept.
  • #46 The world wide web wasn’t exactly all new stuff; it just brought things together. Many of the methodologies being applied to LA are not particularly new – text analysis and sentiment analysis, for example. Our tools are just quicker and do a consistent job.
  • #47 At Hull
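Referring back to note #17, a minimal sketch of the boxplot fence arithmetic described there (Tukey’s convention; the sample data are made up for illustration):

```python
# Sketch of Tukey boxplot fences: quartiles, IQR, the 1.5 x IQR inner
# fence (suspected outliers) and 3 x IQR outer fence (outliers).
import statistics

values = [12, 15, 14, 10, 18, 16, 13, 40, 11, 17]  # made-up sample

q1, _, q3 = statistics.quantiles(values, n=4)  # first and third quartiles
iqr = q3 - q1

inner = (q1 - 1.5 * iqr, q3 + 1.5 * iqr)  # beyond this: suspected outliers
outer = (q1 - 3.0 * iqr, q3 + 3.0 * iqr)  # beyond this: outliers

print(f"IQR = {iqr:.2f}, inner fence = {inner}, outer fence = {outer}")
print("suspected outliers:",
      [v for v in values if not inner[0] <= v <= inner[1]])
```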