Sarah Davies, Head of higher education and student experience, Jisc
Data-informed blended learning design & delivery
Learning design and curriculum design
Image CC BY-SA 2.0 Aspire-edu
» The design of learning as a purposeful activity with a planned outcome
» Courses, modules, lessons, activities
» A spectrum: from totally informal to structured and supported
Learning analytics
» The application of big data techniques, such as machine learning and data mining, to help learners and institutions meet their goals:
– improve retention
– improve achievement
– reduce differential outcomes
Image CC BY-SA 2.0 Aspire-edu
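The retention goal above is often approached as a risk-flagging exercise. As a minimal sketch (the feature names, weights, and thresholds here are invented for illustration, not Jisc's actual predictive model):

```python
# Minimal sketch: flag students at risk of non-retention from a weighted
# engagement score. Feature names and weights are illustrative only.

def risk_score(vle_logins: int, library_loans: int, attendance_pct: float) -> float:
    """Combine simple engagement signals into a 0-1 risk score (higher = more at risk)."""
    engagement = (
        0.5 * min(vle_logins / 20, 1.0)      # cap each signal at a notional norm
        + 0.2 * min(library_loans / 5, 1.0)
        + 0.3 * (attendance_pct / 100)
    )
    return round(1.0 - engagement, 2)

def at_risk(students: dict[str, tuple[int, int, float]], threshold: float = 0.5) -> list[str]:
    """Return student IDs whose risk score meets the threshold."""
    return [sid for sid, feats in students.items() if risk_score(*feats) >= threshold]

cohort = {
    "s001": (25, 6, 90.0),   # engaged
    "s002": (2, 0, 40.0),    # disengaged
}
print(at_risk(cohort))  # → ['s002']
```

A production predictor would learn the weights from historical outcomes rather than hand-setting them; the point is only the shape of the pipeline: signals in, ranked risk out.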
How do they interact?
» Curriculum design
» Data
» Learning analytics
26/04/2018 Educational analytics 4
Jisc Learning Analytics open architecture: core
» Layers: Data Collection → Data Storage and Analysis → Presentation and Action
» Presentation and action: Alert and Intervention system; Other Staff Dashboards; Consent Service (tbc); Student App: Study Goal; Staff dashboards in Data Explorer
» Storage and analysis: Jisc Learning Analytics Predictor; Learning Data Hub; Data Aggregator
» Data sources: Student Records, VLE, Library, Self Declared Data, Attendance, Presence, Equipment use etc.
» Collection: UDD Transformation Toolkit Plugins and/or Universal xAPI Translator
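The xAPI Translator in the collection layer converts source-system events into xAPI statements for storage. The actor/verb/object structure below follows the xAPI specification; the specific IDs, URLs, and names are illustrative:

```python
# A minimal xAPI statement of the kind a VLE plugin or xAPI translator might
# emit into a learning records store. Structure per the xAPI specification;
# the specific identifiers and URLs here are made up.
import json

statement = {
    "actor": {
        "objectType": "Agent",
        "mbox": "mailto:student@example.ac.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/launched",
        "display": {"en-GB": "launched"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://vle.example.ac.uk/course/eco101/week3",
        "definition": {"name": {"en-GB": "ECO101 week 3 materials"}},
    },
    "timestamp": "2018-04-26T09:30:00Z",
}

print(json.dumps(statement, indent=2))
```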
Challenges and enablers
Institutional barriers and enablers:
» Legal and ethical issues
» Data wrangling
» Culture and processes
» Estates and technology
» Strategic vision and appetite
» Data capability
» Data warehouse
» Process redesign
» Course redesign
Staff digital capabilities
» ICT proficiency
» Information, data and media literacies
» Digital learning and development
» Digital creation, problem solving and innovation
» Communication, collaboration and participation
» Digital identity and wellbeing
Improving the student learning experience
Designing learning and assessment… in a digital age
» Based on interviews & case studies
» Aimed at FE and HE
» Freely available tools & techniques
» ji.sc/design-learn-assess
Image CC BY-NC-ND Jisc
Where do you use data in this lifecycle?
Review context → Explore possibilities → Design course/module → Deliver, assess, feedback → Review course delivery
When do you use data?
What for?
What data do you use?
How often?
How do you interpret it?
What would help make it easier?
Potential use cases - curriculum analytics
»Understanding what works
»What are the learners using?
»What are they learning?
»What works for my incoming learners?
»Recommender for students
Understanding what works
»Analysis of patterns of achievement across large cohorts (faculty, institution) against recorded aspects of curriculum design (assessment structure, online activities) to derive good-practice recommendations (in policies and course design) for student success
› And assess the impact of implementing those recommendations
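This kind of analysis can be sketched as a group-and-compare over modules. The design categories and pass rates below are made up for illustration:

```python
# Sketch of "understanding what works": compare average module pass rates
# grouped by a recorded design feature (here, assessment structure).
# All data and category names are illustrative.
from collections import defaultdict
from statistics import mean

modules = [
    # (assessment_structure, pass_rate)
    ("continuous", 0.82),
    ("continuous", 0.78),
    ("single_exam", 0.64),
    ("single_exam", 0.70),
    ("mixed", 0.75),
]

by_design = defaultdict(list)
for structure, pass_rate in modules:
    by_design[structure].append(pass_rate)

for structure, rates in sorted(by_design.items()):
    print(f"{structure}: mean pass rate {mean(rates):.2f} over {len(rates)} modules")
```

At institutional scale the same comparison would need controls for subject, cohort intake, and module level before it could ground any recommendation.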
What are the learners using?
»Cohort activity dashboard for the course team showing hot and cold spots in engagement with materials across multiple sources (VLE, reading lists, specialist systems…) – could be analysed against assessment marks.
› Enables lecturers to adjust signposting to key resources or re-teach cold spots during the current run of the module
› Enables automatic recommendations for students on which key resources to use next
› Enables adjustments/redesign between presentations of the module
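Hot/cold spot detection reduces to counting accesses per resource and flagging outliers. A minimal sketch, with invented event data and an arbitrary threshold:

```python
# Sketch of cold-spot detection: count accesses per resource across a cohort
# and label resources well below the median as "cold". Events and the
# threshold are illustrative.
from collections import Counter
from statistics import median

# (student_id, resource_id) access events, e.g. merged from VLE and reading-list logs
events = [
    ("s1", "week1-video"), ("s2", "week1-video"), ("s3", "week1-video"),
    ("s1", "week2-reading"), ("s2", "week2-reading"),
    ("s1", "week3-quiz"),
]

counts = Counter(resource for _, resource in events)
cutoff = median(counts.values()) / 2

cold_spots = sorted(r for r, n in counts.items() if n <= cutoff)
print(cold_spots)  # resources the cohort is barely touching
```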
What are they learning?
»Module-level analysis of which curriculum and academic practice areas students are stronger and weaker in on average, based on detailed assessment data (eg marks against rubric criteria, marks against questions in tests, common mistakes, automated analysis of assignments?)
› Used to adapt teaching to focus on key misunderstandings/gaps
› Signal support in key areas of academic practice
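The rubric-marks variant of this is a small aggregation: average per criterion, then surface the weakest. Criteria names and marks below are invented:

```python
# Sketch of rubric-level analysis: average marks per rubric criterion across
# a module, to surface where the cohort is weakest. All data illustrative.
from collections import defaultdict
from statistics import mean

# (student_id, criterion, mark out of 10)
marks = [
    ("s1", "argument", 7), ("s1", "referencing", 4),
    ("s2", "argument", 8), ("s2", "referencing", 5),
    ("s3", "argument", 6), ("s3", "referencing", 3),
]

by_criterion = defaultdict(list)
for _, criterion, mark in marks:
    by_criterion[criterion].append(mark)

weakest = min(by_criterion, key=lambda c: mean(by_criterion[c]))
print(weakest)  # → 'referencing'
```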
What works for my incoming learners?
»Analysis of data on an incoming cohort, based on their engagement patterns on previous modules or, for first years, on other characteristics, eg incoming qualifications, family history of HE, distance to study…
› Enabling adaptation of the learning design for the incoming cohort.
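As a toy sketch of profiling an incoming cohort by one prior characteristic (the qualification categories and engagement figures are invented):

```python
# Sketch: profile an incoming cohort by a prior characteristic (here,
# incoming qualification) and compare average prior-module engagement,
# so the learning design can be adapted. All values illustrative.
from collections import defaultdict
from statistics import mean

incoming = [
    # (qualification, avg weekly VLE hours on previous modules)
    ("A-level", 4.0), ("A-level", 5.0),
    ("BTEC", 2.5), ("BTEC", 3.5),
]

by_qual = defaultdict(list)
for qual, hours in incoming:
    by_qual[qual].append(hours)

profile = {q: round(mean(h), 1) for q, h in by_qual.items()}
print(profile)
```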
Recommender for students
»Recommendations to students for key learning activities to do or resources to read/use, based on analysis of patterns of activity of previous successful students.
› What do students want to see to help them steer a path through their studies?
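In its simplest form this is a popularity recommender: rank resources by how often previously successful students used them, then suggest the ones a given student has not yet touched. A sketch with invented usage data:

```python
# Sketch of a simple recommender: rank resources by use among previously
# successful students, excluding ones the target student already used.
# Data is illustrative.
from collections import Counter

# resources used by previous students who passed the module
successful_usage = [
    ("p1", "practice-quiz"), ("p2", "practice-quiz"), ("p3", "practice-quiz"),
    ("p1", "seminar-notes"), ("p2", "seminar-notes"),
    ("p1", "extra-reading"),
]

def recommend(already_used: set[str], top_n: int = 2) -> list[str]:
    """Most popular resources among successful students, minus those already used."""
    popularity = Counter(res for _, res in successful_usage)
    return [res for res, _ in popularity.most_common()
            if res not in already_used][:top_n]

print(recommend({"practice-quiz"}))  # → ['seminar-notes', 'extra-reading']
```

Real recommenders would also weight by recency and module stage, and the open question on the slide (what students actually want surfaced) matters more than the ranking algorithm.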
Discuss in your groups:
»Are you already doing any of these?
»Would your institution be interested in exploring these kinds of use cases? Which ones?
»Are there any that are totally implausible or infeasible?
»How do we best engage course teams and students in this? Any other key stakeholders?
»What could Jisc do to support you in moving towards any of these?
jisc.ac.uk
Get in touch
https://analytics.jiscinvolve.org/
Sarah Davies
Head of HE and student experience
sarah.davies@jisc.ac.uk


Editor's Notes

  • #3 Or could just do it as a hands up or discussion at the tables? Would be useful to know if anyone captures the learning designs/structures
  • #8 Jisc, responding to our stakeholders’ needs, have created tools and resources to support institutions in building their staff’s digital capabilities. Data capability was one of the key additions to the framework when we reviewed it in 2015. What kind of data capability staff need depends on role – there’s the organisational stuff that HESA highlight, down to interpreting and acting on the outputs of LA predictive modelling.
  • #9 Illustrates how, building on the foundation of the data we capture to support the basic retention and attainment use case for learning analytics, we can gather additional data that helps improve our physical and virtual estate, and our teaching and curriculum design and delivery. Include example of data-informed curriculum modification/design
  • #10 This is the ‘proper’ high-level lifecycle
  • #11 This is a rough and ready one used for convenience for this activity.