Learning Analytics in a
Standardisation Context
Tore Hoel
Oslo and Akershus University of Applied Sciences
Norway
Open Forum, ISO/IEC JTC1/SC36 Melbourne meeting, June 2017
It is all about Access to Data
5 forces that are changing our world
Image credit: https://www.flickr.com/photos/haydnseek/2534088367
Student data is seen as the ‘new black’,
as a resource to be mined, scraped &
abused/misused
Image credit: http://fpif.org/wp-content/uploads/2013/01/great-oil-swindle-peak-oil-world-energy-outlook.jpg
By: Paul Prinsloo (University of South Africa, Unisa)
Scottish Herald (May 2016)
The Independent (June 2016)
Page credit: http://insider.foxnews.com/2016/01/31/oklahoma-college-forcing-students-wear-fitbits By: Paul Prinsloo (University of South Africa, Unisa)
Page credit: https://inews.co.uk/essentials/news/education/university-monitor-student-social-media-gauge-well/ By: Paul Prinsloo (University of South Africa, Unisa)
Page credit: http://www.ft.com/cms/s/2/634624c6-312b-11e5-91ac-a5e17d9b4cff.html#slide0 By: Paul Prinsloo (University of South Africa, Unisa)
Times Higher (Feb 2016)
Page credit: http://hechingerreport.org/students-worry-education-technology-might-predict-failure-chance-succeed/ By: Paul Prinsloo (University of South Africa, Unisa)
Emotion detection in the classroom
From: Dale Davies The City of Liverpool College
Learning Analytics is…
«The measurement, collection, analysis and
reporting of data about learners and their
contexts, for purposes of understanding and
optimizing learning and the environments in
which it occurs.»
Society for Learning
Analytics Research (SoLAR)
Not Theoretical Insights!
Not Reporting! Actionable intelligence!
“Learning analytics refers to the
measurement, collection, analysis and
reporting of data about the progress of
learners and the contexts in which learning
takes place”
(https://www.jisc.ac.uk/reports/learning-analytics-in-higher-education)
It is about ‘learning’, and
informed by our beliefs
about what constitutes learning…
What do we measure?
How do we measure?
When do we measure?
What don’t we measure?
Where do we collect, what, how often, for what purpose, and who does the collection?
To whom do we report? What then?
Who will act? What if we cannot act? What if students don’t act?
Who analyses the data, using what, what skills are required, who verifies the analysis?
How do we define ‘progress’? How do we
involve students in making sense of the
data, of their progress, of their journeys?
Do we assume that all their learning takes
place on the LMS? What do we assume
about their logins, their downloads, their
clicks?
By: Paul Prinsloo (University of South Africa, Unisa)
You don't give bread to
the baker's children…
Australia rules learning analytics!
ID14-3821: ENABLING CONNECTED LEARNING VIA OPEN SOURCE ANALYTICS IN
THE WILD: LEARNING ANALYTICS BEYOND THE LMS
This project is supported by the Australian Government’s Office for Learning and Teaching
QUEENSLAND UNIVERSITY OF TECHNOLOGY:
Kirsty Kitto (Lead Investigator), Mandy Lupton, John Banks, Dann Mallet, Peter Bruza
UNIVERSITY OF SOUTH AUSTRALIA
Shane Dawson, Dragan Gašević (Uni of Edinburgh)
UNIVERSITY OF TECHNOLOGY SYDNEY
Simon Buckingham Shum (and now Kirsty Kitto!)
UNIVERSITY OF SYDNEY
Abelardo Pardo
UNIVERSITY OF TEXAS (ARLINGTON)
George Siemens
the connected learning analytics toolkit
[Architecture diagram: social media → scraping → xAPI → Learning Record Store → analysis → learning analytics, serving students, academics, and admin & developers]
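The xAPI leg of this toolkit can be illustrated with a minimal sketch of the kind of statement a scraper would push into a Learning Record Store. The statement layout (actor, verb, object) follows the xAPI specification; the account `homePage`, activity id, and user id below are invented for the example.

```python
import json
import uuid
from datetime import datetime, timezone

def make_statement(user_id, verb_id, verb_display, activity_id, activity_name):
    """Build a minimal xAPI statement (actor-verb-object plus timestamp)."""
    return {
        "id": str(uuid.uuid4()),
        "actor": {
            "objectType": "Agent",
            # homePage and name are placeholders, not a real institution
            "account": {"homePage": "https://example.edu", "name": user_id},
        },
        "verb": {
            "id": verb_id,
            "display": {"en-US": verb_display},
        },
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = make_statement(
    "student42",
    "http://adlnet.gov/expapi/verbs/commented",
    "commented",
    "https://example.edu/forum/thread/17",
    "Week 3 discussion thread",
)
print(json.dumps(stmt, indent=2))
```

In a live deployment this JSON would be POSTed to the LRS statements endpoint with the store's credentials; here it is only constructed and printed.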
a “go look at it” approach tends to fail
▪ students don’t apply knowledge
▪ limited reflection
▪ often blindly believe LA instead of
questioning and reinterpreting it
This would be facilitated by learning design
patterns
But these need to be very general!
http://www.rde.nsw.edu.au/lxd/2015/04/02/think-pair-share/
https://netbeans.org/kb/docs/javaee/ecommerce/design.html
what should I do?
▪ authentic integration with assessment is necessary
▪ student-facing LA is great for formative scenarios
▪ 3 learning design patterns have been created to facilitate this:
- do-analyse-change-reflect
- active learning squared
- groupwork
Kitto, K., Lupton, M., Davis, K., Waters, Z. (2016). Incorporating student-facing learning analytics into
pedagogical practice. In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.), Show Me The Learning.
Proceedings ASCILITE 2016 Adelaide, pp. 338-347.
Thanks to Kirsty Kitto for allowing me to use these slides
ISO
Learning Analytics
standard
Benefits for the Learner
• tracking learning activities and progression;
• tracking emotion, motivation and learning-readiness;
• early detection of learner’s personal needs and preferences;
• improved feedback from analysing activities and assessments;
• early detection of learner non-performance (mobilizing
remediation);
• personalized learning path and/or resources (recommendation).
ISO/IEC 20748-1
Benefits for the Teacher
• tracking learners/group activities and progression;
• adaptive teacher response to observed learner’s needs and behaviour;
• early detection of learner disengagement (mobilizing relevant support
actions);
• increasing the range of activities that can be used for assessing
performance;
• visualization of learning outcomes and activities for individuals and groups;
• providing evidence to help teacher improve the design of the learning
experience and resources.
ISO/IEC 20748-1
Benefits for the Institution
• tracking class/group activities and results;
• quality assurance monitoring;
• providing evidence to support the design of the learning
environment;
• providing evidence to support improved retention strategies;
• support for course planning.
ISO/IEC 20748-1
Learning Analytics Process Model
Privacy and Data Protection Policies
Developing a technical report on
LA Privacy & Data Protection Policies
ISO/IEC SC36 WG8
Sunday, 12 March 2017 meeting co-located with LAK17
Privacy & Data Protection Policies
What to specify?
• What information should be exchanged between institutions
(or third parties) and the learner to set up an LA session
• Consent mechanisms
• ISO/TS 17975:2015 Health informatics
• Accountability and governance of data used for LA
• ISO/IEC 29134:2017 Information technology — Security techniques —
Guidelines for privacy impact assessment
Do we measure the right things?
The role of standardisation
Semantic clarity
Identification of “At Risk” Students in OAAI
Images from OAAI accessed at: https://www.slideshare.net/SandeepMJayaprakash/apereo-oaai-presentation-final
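Dashboards like OAAI's turn a handful of LMS signals into a traffic-light flag per student. The sketch below is purely illustrative of that idea: OAAI itself trained predictive models on historical course data, whereas the weights and thresholds here are invented for the example.

```python
def risk_score(partial_grade, logins_per_week, submitted_ratio):
    """Combine a few LMS signals into a 0..1 risk score (higher = more at risk).
    Weights and cut-offs are invented for illustration, not OAAI's model."""
    grade_risk = max(0.0, (60 - partial_grade) / 60)   # below 60% raises risk
    login_risk = max(0.0, (3 - logins_per_week) / 3)   # under ~3 logins/week
    submit_risk = 1.0 - submitted_ratio                # missing assignments
    return round(0.5 * grade_risk + 0.2 * login_risk + 0.3 * submit_risk, 3)

def risk_band(score):
    # Traffic-light bands of the kind shown on OAAI-style dashboards
    return "high" if score >= 0.5 else "medium" if score >= 0.25 else "low"

s = risk_score(partial_grade=45, logins_per_week=1, submitted_ratio=0.5)
print(s, risk_band(s))  # → 0.408 medium
```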
Social Learning Analytics
Ferguson, R., & Shum, S. B. (2012). Social learning analytics: five approaches. In Proceedings of the 2nd international conference on learning
analytics and knowledge (pp. 23-33). ACM.
▪ social network analytics
▪ discourse analytics
▪ content analytics
▪ disposition analytics
▪ context analytics
Social Network Analysis (SNA)
▪ SNA has been used for
computational social
science for many years
▪ became very popular in LA
with the development of
the SNAPP tool
▪ more next session!
Aneesha Bakharia and Shane Dawson. 2011. SNAPP: a bird's-eye view of temporal participant interaction. In Proceedings of the 1st International
Conference on Learning Analytics and Knowledge (LAK '11). ACM, New York, NY, USA, 168-173.
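The core computation behind a SNAPP-style view can be sketched in a few lines: build a who-replied-to-whom network from forum posts and compute degree centrality. The participants and replies below are made up for the example.

```python
from collections import defaultdict

# Invented forum data: (author, replied_to) pairs
replies = [
    ("ana", "ben"), ("carl", "ben"), ("dia", "ben"),
    ("ben", "ana"), ("carl", "ana"), ("dia", "carl"),
]

# Build an undirected interaction network
neighbours = defaultdict(set)
for a, b in replies:
    neighbours[a].add(b)
    neighbours[b].add(a)

# Degree centrality: fraction of other participants each person interacts with
n = len(neighbours)
centrality = {node: len(adj) / (n - 1) for node, adj in neighbours.items()}
for node in sorted(centrality, key=centrality.get, reverse=True):
    print(f"{node}: {centrality[node]:.2f}")
```

On this toy network "ben" and "carl" are maximally central (1.00), which is exactly the kind of hub, and conversely isolate, that SNAPP makes visible to instructors.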
Disposition analytics
Deakin Crick, R., Huang, S., Ahmed-Shafi, A., & Goldspink, C. (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning
Power. British Journal of Educational Studies, 63(2), 121-160. General information available at: https://utscic.edu.au/tools/clara/
Discourse analytics
Kovanović, V., Joksimović, S., Waters, Z., Gašević, D., Kitto, K., Hatala, M., & Siemens, G. (2016). Towards automated content analysis of discussion
transcripts: a cognitive presence case. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 15-24). ACM.
Writing analytics
Writing Analytics focuses on the measurement and
analysis of written texts for the purpose of understanding
writing processes and products, in their educational
contexts, and improving the teaching and learning of
writing.
Simon Buckingham Shum, Simon Knight, Danielle McNamara, Laura Allen, Duygu Bektik, and Scott Crossley. 2016. Critical perspectives on writing
analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK '16). ACM, New York, NY, USA, 481-483.
reflective
writing
analytics
▪ Demo available:
http://nlytx.io/2016/metacognition/index.html
▪ Infrastructure since developed at UTS:
https://utscic.edu.au/projects/uts-projects/a3r-authentic-assessment-analytics-reflective-writing/
Gibson, A., Kitto, K., & Bruza, P. (2016). Towards the Discovery of Learner Metacognition
From Reflective Writing. Journal of Learning Analytics, 3(2), 22-36.
Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C., & Knight, S.
(2017, March). Reflective writing analytics for actionable feedback. In Proceedings of
the Seventh International Learning Analytics & Knowledge Conference (pp. 153-162).
ACM.
But different data comes from different
instructional modalities
Blikstein, P., & Worsley, M. (2016). Multimodal Learning Analytics and Education Data Mining: using computational technologies to measure complex
learning tasks. Journal of Learning Analytics, 3(2), 220-238.
Constructivist vs direct instruction models have very
different data collection possibilities
- Direct instruction relies on LMSs, eTexts, standardised
testing… has many tools for collecting data
- What about constructivist, experiential, inquiry based
approaches? Where is the data “in the wild”?
Multimodal Learning Analytics
Combines data from many different sources (i.e. not just from
formal learning environments like LMSs):
- text, handwriting, sketch… analysis
- speech, action and gesture analysis
- affective state analysis
- neurophysiological states (EEG)
- eye gaze
Blikstein, P., & Worsley, M. (2016). Multimodal Learning Analytics and Education Data Mining: using computational technologies to measure complex
learning tasks. Journal of Learning Analytics, 3(2), 220-238.
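Combining such streams usually starts with a timestamp join. A hypothetical sketch, aligning each clickstream event with the most recent affect label at or before it (all timestamps, events, and labels are invented):

```python
import bisect

# Two invented modality streams: (seconds, value)
clicks = [(2.0, "open_task"), (9.5, "submit_attempt"), (14.0, "view_feedback")]
affect = [(0.0, "neutral"), (8.0, "frustrated"), (13.0, "relieved")]

affect_times = [t for t, _ in affect]  # sorted timestamps for binary search

def label_at(t):
    """Most recent affect label at or before time t."""
    i = bisect.bisect_right(affect_times, t) - 1
    return affect[i][1]

aligned = [(t, event, label_at(t)) for t, event in clicks]
print(aligned)
```

Real multimodal pipelines add resampling, synchronisation across clocks, and feature extraction per modality, but this nearest-preceding-sample join is the basic alignment step.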
Storing and Processing
AI and Deep Learning in LA
What more? - Discuss!
谢谢您的关注 (Thank you for your attention)
This work is licensed under a Creative Commons
Attribution 4.0 International (CC BY 4.0).
tore.hoel@hioa.no
Twitter: @tore
WeChat: Tore_no
Approaches to being ethical
In times of stability:
• Rules and regulatory
frameworks
• Terms & conditions
• By consent and/or contract
In times of rapid change:
• Potential of harm
• The scope of consent
• Negotiate and agree options
in case of unintended harm
Source: Presentation by Prinsloo & Slade, LAK17
From Tim McKay’s
keynote
What influences privacy
requirements for LA?
Privacy frameworks
OECD | APEC | EU GDPR
- | Preventing Harm | Lawfulness, Fairness and Transparency
Collection Limitation | Collection Limitation | Data Minimisation
Purpose Specification | Choice | Purpose Limitation
Use Limitation | Uses of Personal Information | Storage Limitation
Data Quality | Integrity of Personal Information | Integrity and Confidentiality
Openness | Notice | -
Individual Participation | Access & Correction | Accuracy
Accountability | Accountability | Accountability
Security Safeguards | Security Safeguards | -
- | - | Data Protection by Design and by Default
New European Data Protection
Regulation (GDPR) for the digital age
• Consent for processing data: A clear
affirmative action
• Easy access to your own data (Data
Portability)
• Data breaches (e.g., hacking): Notice
without undue delay
• Right to be forgotten
• Data Protection by
Design and Data
Protection by Default
Published May 2016 –
National law in all
European countries
from 2018
GDPR ➔ Pedagogical Requirements
LA Processes | GDPR Requirements | Pedagogical Requirements
Learning activity | Give information of processing operation and purpose | Explicit formulation of the scope of LA processes. Choice of metrics that give answers to the pedagogical questions that initiated the LA process.
Data collection | Affirmative action of consent to data collection | Support of learner agency
Data storage and processing | Access to, and rectification or erasure of, personal data. Exercise the right to be forgotten. Pseudonymisation and risk assessment | Support of learner agency
Analysis | Meaningful information about the logic involved. Information on profiling, e.g., predictive modelling | Support of learner agency and understanding of learning context
Visualisation | General requirements about transparency and communication | Selection of salient issues for pedagogical intervention
Feedback actions | Information about the significance and envisaged consequences of data processing | Pedagogical intervention, relating actions to pedagogical goals
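One common way to meet the pseudonymisation requirement in the data storage and processing step is a keyed hash of student identifiers, so the analytics store never holds raw IDs. A minimal sketch, assuming the key is held by the institution separately from the analytics store (the key value and record fields are invented):

```python
import hashlib
import hmac

# Placeholder only: in practice the key lives in a secrets store, not in code
SECRET_KEY = b"replace-with-institution-held-key"

def pseudonymise(student_id: str) -> str:
    """Keyed hash of a student ID; stable per student, not reversible
    without the institution-held key."""
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

# Analytics records carry the pseudonym instead of the real identifier
record = {"student": pseudonymise("s1234567"), "quiz_score": 0.82}
print(record)
```

Because the mapping is deterministic per key, analyses can still link a student's events across sessions, while re-identification requires access to the separately held key.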
GDPR inspired system
requirements
• Right to be informed
• Right to access
• Right to rectification
• Right to erasure
• Right to restrict processing
• Right to data portability
• Right to object
• Right related to automated
decision making and
profiling
• Accountability and
governance
• Breach notification
• Transfer of data (outside of
EU)
• Data Protection by Design
and by Default
Right to be informed
• The learner will know…
• what the purpose of the LA session is
• what data are collected
• how data are stored and processed
• the principles for processing (predictive models / algorithms…)
• what visualisations will be shown
• the technical feedback actions designed for the LA process
Automated decision making
/ profiling
• Right not to be subject to decisions based solely
on automated processing
• Learner must be able to…
• …obtain human intervention
• …express their point of view
• …obtain an explanation of decisions and be able to
challenge them
Data Protection by Design
and by Default
• A simple checkbox will not do any more
• Open each sub-process of LA up for discussion
related to data protection