Learning analytics: developing an action plan… developing a vision
Rebecca Ferguson
The Open University, UK
SLATE
September 2017
Innovating in technology-enhanced learning (TEL)
The TEL Complex
2
The many elements of the ‘TEL Complex’ must all be taken into account as an innovation is designed, developed and embedded
Scanlon, E., Sharples, M., Fenton-O'Creevy, M., Fleck, J., Cooban, C., Ferguson, R., Cross, S., & Waterhouse, P. (2013). Beyond Prototypes. London: TEL Programme.
Priority areas for education and training
3
• Bringing together different sectors: higher education, schools & workplace learning
• Building enduring networks
• Helping to develop learning analytics capability
• Creating and sharing resources
• Developing visions of the future and agreeing how to work towards them
www.laceproject.eu
LAEP: learning analytics for European educational policy
4
• What is the current state of the art?
• What are the prospects for the implementation of learning analytics?
• What is the potential for European policy to be used to guide and support the take-up and adaptation of learning analytics to enhance education in Europe?
http://bit.ly/2jLfx9p
5
The Open University
open.ac.uk
Developing institutional strengths
The OU is developing its capabilities in 10 key areas
6
The university needs world class capability in data science to continually mine the data and build rapid prototypes of simple tools, and a clear pipeline for the outputs to be mainstreamed into operations
We need to ensure we have the right architecture and processes for collecting the right data and making it accessible for analytics – we need a ‘big data’ mind-set
Benefits will be realised through existing business processes impacting on students directly and through enhancement of the student learning experience – we will develop an ‘analytics mind-set’ in these areas
The strategic roadmap will build these capabilities, prioritised using the indicators and drivers of student success
Relating design and outcomes
Learning design and analytics at the OU
7
Easily accessible OU data
Learning design and analytics at the OU
8
Learning analytics help us to identify and make sense of patterns in the data to improve our teaching, our learning and our learning environments
Educators use analytics to…
• Monitor the learning process
• Explore student data
• Identify problems
• Discover patterns
• Find early indicators for success
• Find early indicators for poor marks or drop-out (a sketch of one such indicator follows below)
• Assess usefulness of learning materials
• Increase awareness, reflect and self reflect
• Increase understanding of learning environments
• Intervene, supervise, advise and assist
• Improve teaching, resources and the environment
10
Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013).
Supporting Action Research with Learning Analytics. Paper presented at LAK13.
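As a concrete illustration of the early-indicator bullet above, the minimal sketch below flags students whose recent activity in a virtual learning environment (VLE) has dropped off. It is a hypothetical example: the column names, click counts and thresholds are assumptions for illustration and do not describe the OU's actual data model or analytics pipeline.

```python
# Hypothetical sketch of an early drop-out indicator based on weekly VLE activity.
# The columns (student_id, week, clicks) and the thresholds are illustrative assumptions.
import pandas as pd


def flag_at_risk(activity: pd.DataFrame,
                 clicks_per_week: int = 5,
                 quiet_weeks: int = 2) -> list:
    """Return ids of students whose recent activity falls below a simple threshold."""
    latest_week = activity["week"].max()
    recent = activity[activity["week"] > latest_week - quiet_weeks]
    totals = recent.groupby("student_id")["clicks"].sum()
    return sorted(totals[totals < clicks_per_week * quiet_weeks].index)


if __name__ == "__main__":
    demo = pd.DataFrame({
        "student_id": [1, 1, 2, 2, 3, 3],
        "week":       [9, 10, 9, 10, 9, 10],
        "clicks":     [40, 35, 3, 0, 4, 1],
    })
    print(flag_at_risk(demo))  # -> [2, 3]: both fall below 10 clicks across the last 2 weeks
```

A rule this simple would only ever be one signal among many; its purpose here is to show the shape of an indicator, not to stand in for the models used at scale.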
Learners use analytics to…
• Monitor their own activities and interactions
• Monitor the learning process
• Compare their activity with that of others (a sketch of this follows below)
• Increase awareness, reflect and self reflect
• Improve discussion participation
• Improve learning behaviour
• Improve performance
• Become better learners
• Learn!
11
Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013).
Supporting Action Research with Learning Analytics. Paper presented at LAK13.
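As a small illustration of the "compare their activity with that of others" use above, the snippet below shows one way a learner-facing view might put an individual's weekly activity alongside that of their peers. The figures and the 'clicks' measure are invented for the example and are not drawn from any real OU dashboard.

```python
# Hypothetical sketch: showing a learner how their weekly activity compares with peers.
# The peer click counts below are synthetic; 'clicks' as a measure is an assumption.
from statistics import mean


def activity_percentile(my_clicks: int, peer_clicks: list) -> float:
    """Percentage of peers whose weekly activity is at or below this learner's."""
    at_or_below = sum(1 for c in peer_clicks if c <= my_clicks)
    return 100.0 * at_or_below / len(peer_clicks)


peers = [2, 5, 8, 12, 15, 20, 25, 40, 60, 90]   # synthetic weekly click counts
mine = 18

print(f"Peer average this week: {mean(peers):.0f} clicks; yours: {mine}.")
print(f"That places you at roughly the {activity_percentile(mine, peers):.0f}th percentile.")
```

Presenting the comparison as a percentile rather than a rank is a deliberate choice in this sketch: it gives learners a sense of where they sit without singling anyone out.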
RAPID Outcome Mapping Approach (ROMA)
The ROMA Framework
12
Ferguson, R., Macfadyen, L., Clow, D., Tynan, B., Alexander, S., & Dawson, S. (2015). Setting learning analytics in context: overcoming the barriers to large-scale adoption. Journal of Learning Analytics, 1(3), 120-144.
Adapted from: Young, J., & Mendizabal, E. (2009). Helping researchers become policy entrepreneurs: How to develop engagement strategies for evidence-based policy-making. ODI Briefing Papers. London, UK: ODI.
Define (and redefine) your policy objectives
What does success look like?
13
Academic analytics can guide future change
Student perspectives
● Overall, I am satisfied with the quality of this module
● Overall, I am satisfied with my study experience
● I would recommend this module to other students
● I was satisfied with the support provided by my tutor on this module
● I enjoyed studying this module
● This module met my expectations
Academic perspectives
● The students were well prepared
● The students met specified learning outcomes
● The students defined and achieved their own learning goals
University perspectives
● The module enhanced the university’s reputation
● The module aligned well with others
● The module generated income
What does success look like?
● Students demonstrate the skills necessary to network, collaborate, browse and reflect
● Students show progress towards defined learning outcomes
● Students communicate well… when asked to collaborate
● Students access and share links… when encouraged to browse
● Students return to materials... when encouraged to reflect
● Students engage with course content
● Students seek out new challenges
● Students persist when the work is challenging
● Students persist in the face of failure
● Students ask for help… when they are stuck after several attempts
● Students compare their learning strategies with those of experts
● Students adapt their learning strategies to resemble those of experts
14
Learning analytics help to identify appropriate interventions
Policy objectives
OU Strategic Analytics Investment Programme
15
Vision
To use and apply information strategically to retain students and enable them to progress and achieve their study goals.
This vision requires
• Discursive changes to the communication of data and analytics
• Procedural changes in how learners are supported
• Behavioural changes associated with sustainable change in learner support.
Define (and redefine) your policy objectives
Political context
Mapping people and processes
16
Tynan, B. & Buckingham Shum, S. (2013). Designing systemic learning analytics at the Open University.
http://www.slideshare.net/sbs/designing-systemic-learning-analytics-at-the-open-university
Key stakeholders
OU Strategic Analytics Investment Programme
17
Define (and redefine) your policy objectives
A community of stakeholders
working in different areas:
• Intervention and Evaluation
• Data Usability
• Ethics Framework
• Predictive Modelling
• Learning Experience Data
• Professional Data
• Student Tools
Key stakeholders are
• University administrators
• Students
• Educators
Desired behaviour changes
OU Strategic Analytics Investment Programme
18
Define (and redefine) your policy objectives
Vision
To use and apply information strategically to retain students and enable them to progress and achieve their study goals.
Desired behaviour changes
• Staff will use and apply information strategically
• Students will extend their learning journeys
• Students will complete their learning journeys
• Students will set learning goals
• Students will work effectively towards study goals
Engagement strategy
OU Strategic Analytics Investment Programme
19
Define (and redefine) your policy objectives
• Data in action is provided to stakeholders through a live portal, enabling them to understand learner behaviour and make adjustments and interventions that will have an immediate positive impact.
• Data on action is a more reflective process that takes place after an adjustment or intervention.
• Data for action takes advantage of predictive modelling and innovation in order to isolate particular variables and make changes based on a variety of analysis tools.
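To make the "data for action" strand more concrete, here is a toy sketch of the kind of predictive modelling it relies on. Everything in it is an assumption for illustration: the engagement features, the synthetic completion outcome and the choice of logistic regression stand in for whatever models and variables the programme actually uses.

```python
# Illustrative sketch only: a toy predictive model of the kind 'data for action' draws on.
# The engagement features and the completion outcome are synthetic, not OU data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic weekly engagement: VLE clicks, forum posts, assignments submitted.
X = rng.poisson(lam=[20.0, 3.0, 1.0], size=(500, 3)).astype(float)

# Hypothetical ground truth: completion becomes more likely as engagement rises.
logits = 0.05 * X[:, 0] + 0.3 * X[:, 1] + 0.8 * X[:, 2] - 2.5
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Predicted completion probabilities could then inform adjustments and interventions.
print("held-out accuracy:", round(model.score(X_test, y_test), 2))
print("coefficients (clicks, posts, assignments):", model.coef_.round(2))
```

The pipeline shape is the point: features in, a predicted probability out, and people deciding what intervention, if any, follows.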
Internal capacity to effect change
OU Strategic Analytics Investment Programme
20
Define (and redefine) your policy objectives
Includes
• Recruitment
• Capacity building
• Developing an ethical framework for the use of learning analytics.
Monitoring
OU Strategic Analytics Investment Programme
21
Tynan, B. & Buckingham Shum, S. (2013). Designing systemic learning analytics at the Open University.
http://www.slideshare.net/sbs/designing-systemic-learning-analytics-at-the-open-university
22
What is your vision?
Pedagogy
We have a social duty to facilitate and provide opportunities for learners to achieve their full potential
• Why do we educate people?
• How do people learn?
• What pedagogic outcomes are we
trying to achieve?
• How can we measure those outcomes?
Learning is not only about success – it is about learning from failure
there is a time for learners to be confronted in order for transformation and growth to occur
We need to nurture rich, reflective communities in both teaching and learning
Smart houses, wearable technology, the Internet of Things and face recognition are increasingly part of everyday life
Hard to believe that there will be enough processing power to do this, but I guess people always say that when something is ten years away
A new government authority that acts as a trusted clearing house for data and analytics
Complexity
• How can we understand the internal process of learning by measuring external actions?
• How do we engage a wide range of stakeholders?
• How do we process huge amounts of data from diverse sources?
Ethics
• We need some form of regulation in this area
• Control of data has ethical implications
• Encourage awareness of how data are used and how analytics function
• Focusing on data as a valuable commodity can lead to unethical practices
The key is to establish the notion that each of us own our own data: the companies do not
institutional rules and regulations must exist and should meet certain criteria
As long as the data is anonymous, data should be allowed to be used in these kinds of applications without any consent
One of the purposes of LA is to empower the teachers to provide better learning for the individual learners
It is even worse to put that control in the hands of system designers and programmers, thus embedding their assumptions and beliefs
• Who should control the data?
• Who should control the learning and teaching process?
• Who sets goals for learners and teachers?
• Who needs to understand the analytic process?
Power
if tracking and monitoring are used to foster and support education and learning, it might be desirable. If it is used to monitor and control and to enforce power it is not desirable
drawing on previous legislation in the areas of privacy, child protection, data protection, consumer protection, and the use of personal data in medical research
It must be handled as a human right in the 21st century that every single person should have the power to decide, when + how + for what purpose + for which timeframe + ... his/her personal data can/cannot be used
• Need to regulate protection, ownership and storage of data
• Need new policies on education, ethics, privacy and assessment
• Need to decide how this regulation is developed and enforced
Regulation
Very little credible research has demonstrated any real large-scale benefits to learners or institutions
The use of LA applications in real practice has to be conscious of the limitations of any analysis, and apply them in a way that is coherent with the limitations of the approach
we MUST be willing to unpack the algorithms. Academics are extremely unlikely to accept 'black box' predictive tools - it goes against the very principles of critical thought
Validity
How can we be sure that the results generated by learning analytics are valid, reliable and generalisable?
Affect
• Bear in mind what engages and motivates teachers and learners
• Be aware that there is discomfort and unease about various aspects of learning analytics
the real fuel of learning is motivation and volition, which you cannot capture with external sensors
I might be an alarmist, but there is too much at stake: from developing an underclass of limited-dimension roboticized learners, to propaganda-fed righteous fanatics, an automated, corrupted learning environment puts us on a path to an Orwellian future
autonomy begets engagement, motivation, persistence, relevance
Visions of the future
bit.ly/28X5tq7
Which vision are you working towards?
31
Slides online at www.slideshare.net/R3beccaF
Rebecca Ferguson @R3beccaF
http://r3beccaf.wordpress.com/


Editor's Notes

  • #2 Introduction. If you have attended the Learning Analytics Summer Institute (LASI Asia) this week, some of the early slides here will look familiar, but I am going to focus here much more on actions to be taken
  • #4 The Learning Analytics Community Exchange (LACE) project in Europe has been thinking about the future of learning analytics – which futures we want to work towards and which we want to avoid. To investigate this, we have carried out a Policy Delphi, a form of research designed to elicit a range of expert views on a topic. In this case, we developed eight provocations or visions of the future of learning analytics. Using a survey, we shared these with experts and practitioners around the world and asked them to comment on at least two visions in terms of desirability, feasibility, and actions that would need to be taken.
  • #6 Introduction to The Open University, to the Learning Analytics Community Exchange (LACE) project and to the Learning Analytics for European Educational Policy (LAEP) project.
  • #10 A rephrasing of that definition
  • #24 Analysis of these responses to our Policy Delphi helped us to identify seven major themes, with associated questions and issues. The first of these is pedagogy – a theorised approach to learning and teaching
  • #25 The second theme is complexity – lots of this will be difficult to do, but there are ways to develop this work, and precedents on which to build
  • #26 Ethics was a theme that came up in relation to all the provocations
  • #27 Power is a theme that has been less considered in relation to learning analytics – although sociologists are already querying the uses and implications of big data
  • #28 Regulation ties in with both power and ethics. If learning analytics are to work well, we need checks and balances in place
  • #29 Validity is an increasing concern as we move away from small pilot projects to large-scale implementation
  • #30 And personal responses are also important. If people aren’t happy with the analytics or if they don’t trust the analytics, then problems arise. This was linked to a recurrent minor theme of alienation
  • #31 Reflection point
  • #32 Access the slides on Slideshare