Supporting Higher Education to Integrate
Learning Analytics
http://sheilaproject.eu/
Yi-Shan Tsai
University of Edinburgh
yi-shan.tsai@ed.ac.uk
@yi_shan_tsai
EUNIS Workshop
7th November 2017
Team
http://sheilaproject.eu/
Supporting Higher Education to
Integrate Learning Analytics
• The state of the art
• Direct engagement
with key stakeholders
• A comprehensive
policy framework
http://sheilaproject.eu/
Slide credit: Dragan Gašević (2017) Let’s get there! Towards policy for adoption of learning analytics. LSAC, Amsterdam, The Netherlands.
http://sheilaproject.eu/
The state of the art
Challenges, adoption and strategy
http://sheilaproject.eu/
Adoption challenges
1. Leadership for strategic implementation &
monitoring
2. Equal engagement with stakeholders
3. Pedagogy-based approaches to removing
learning barriers
4. Training to cultivate data literacy among primary
stakeholders
5. Evidence of impact
6. Context-based policies to address privacy &
ethics issues and other challenges
Tsai, Y. S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
http://sheilaproject.eu/
Essential features of an LA policy…
http://sheilaproject.eu/
[Figure: CHALLENGES – experts’ perspectives]
LA adoption in Europe
• Institutional interviews: 16 countries, 51 HEIs, 64
interviews, 78 participants
[Chart: The adoption of learning analytics (interviews) – 21 institutions had implemented LA (institution-wide: 9; partial/pilots: 7; data exploration/cleaning: 5); the remaining 30 were split between ‘in preparation’ and ‘no plans’ (12 and 18)]
http://sheilaproject.eu/
LA adoption in Europe
• Institutional survey: 22 countries
[Chart: The adoption of LA (survey) – 15 responding institutions had implemented LA (institution-wide: 2; small scale: 13); the remaining responses were split between ‘in preparation’ and ‘no plans’ (15 and 16; scale of adoption marked N/A)]
http://sheilaproject.eu/
LA strategy
http://sheilaproject.eu/
[Diagram: LA typically initiated under wider digitalisation strategies or teaching & learning strategies; often no defined LA strategy; immature plans for monitoring & evaluation]
Stakeholders
Interests and concerns
http://sheilaproject.eu/
Interests – senior managers
• To improve student learning
performance (16%)
• To improve student satisfaction
(13%)
• To improve teaching excellence
(13%)
• To improve student retention
(11%)
• To explore what learning
analytics can do for our
institution/ staff/ students (10%)
http://sheilaproject.eu/
[Diagram: three internal drivers for LA – learner driver, teaching driver, institutional driver]
Interests – teaching staff
• An overview of student attendance, submission of
assignments, access to coursework and resources,
and performance.
• Inform course design.
• Manage a big class.
• Know ‘why’ students struggle.
http://sheilaproject.eu/
Interests – students
Personalised approach
• Inform teaching support and curriculum design.
• Support a widening access policy.
• Support students at all achievement levels to
improve learning.
• Assist with transitions from pre-tertiary education
to higher education, and from higher education to
employment.
http://sheilaproject.eu/
Concerns – senior managers
• No one-size-fits-all solutions
• Pressure to adopt LA
• How can the institution as a whole benefit from LA?
• The strictness of existing data protection
regulations makes adoption more difficult.
http://sheilaproject.eu/
Concerns – teaching staff
• Workload
• Judging staff performance
• Not all learning is digital
• No one-size-fits-all solutions
• Correlation does not suggest causation
• Surveillance of students
http://sheilaproject.eu/
Concerns – students
• Data collection is unnecessarily personal
• Data producing stereotypes and biases
• Limitations in quantifying learning
• Worries about human contact and teaching professionalism being replaced by machines
http://sheilaproject.eu/
Concerns – students
Legitimate or illegitimate?
• Purpose
• Anonymity
• Access
http://sheilaproject.eu/
[Diagram: privacy paradox → the need for transparency and effective communication]
LA policy framework
From ROMA to SHEILA
http://sheilaproject.eu/
ROMA (Rapid Outcome Mapping Approach)
Macfadyen, L., Dawson, S., Pardo, A., & Gašević, D. (2014). The learning analytics imperative and the sociotechnical challenge: Policy for complex systems. Research & Practice in Assessment, 9(Winter 2014), 17-28.
http://sheilaproject.eu/
SHEILA framework
http://sheilaproject.eu/
• Become an associate partner of the
SHEILA project?
• Visit: http://sheilaproject.eu/
Yi-Shan Tsai
yi-shan.tsai@ed.ac.uk
@yi_shan_tsai
http://sheilaproject.eu/

Editor's Notes

  • #3 Partner organisations: The University of Edinburgh, UK; Universidad Carlos III de Madrid, Spain; Open University of the Netherlands, Netherlands; Tallinn University, Estonia; Erasmus Student Network aisbl (ESN), international; European Association for Quality Assurance in Higher Education, international; Brussels Educational Services, international.
  • #4 SHEILA aims to support the adoption of LA in higher education. To do so, we have three clear objectives: (1) understand the state of the art in Europe; (2) engage directly with key stakeholders; (3) develop a policy framework.
  • #7 Challenge 5 has also been identified in Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics.
  • #8 The importance ratings suggest priorities: (1) privacy & ethics (safeguard); (2) management and goals; (3) data management & analysis. Across these themes, the ratings drop noticeably on the ‘ease of implementation’ scale compared with the ‘importance’ scale. One implication is that the six features could become challenges to address when scaling up the adoption of LA. It is also interesting that privacy- and transparency-related actions are considered the easiest to implement. (An illustrative sketch of this importance-vs-ease comparison follows these notes.)
  • #9 16 countries – UK (21), Spain (11), Estonia (3), Ireland (2), Italy (2), Portugal (2), Austria (1), Croatia (1), Czech Republic (1), Finland (1), France (1), Latvia (1), Netherlands (1), Norway (1), Romania (1), and Switzerland (1). 21 out of 51 institutions were already implementing centrally supported learning analytics projects. 25 institutions had established formal working groups, but not all institutions planned to provide analytics data to students.
  • #10 22 countries: Austria, Bulgaria, Cyprus , Czech Republic, Denmark, Estonia, Finland, Germany, Hungary, Ireland, Italy, Lithuania, Netherlands, Norway, Portugal, Romania, Serbia, Slovakia, Spain, Switzerland, Turkey, UK Interview + survey: 26 countries
  • #11 In many cases where LA was supported centrally, it had been initiated under wider digitalisation strategies or teaching and learning strategies. However, many institutions had not defined clear strategies for learning analytics and were still at an ‘experimental’ or ‘exploratory’ stage.
  • #13 The interviews identified three common internal drivers for the adoption of learning analytics. Learner driver: to encourage students to take responsibility for their own studies by providing data-based information or guidance. Teaching driver: to identify learning problems, improve teaching delivery, and allow timely, evidence-based support. Institutional driver: to inform strategic plans, manage resources, and improve institutional performance, such as retention rates and student satisfaction. An equivalent multiple-choice question in the survey provided 11 options for motivations specific to learning and teaching; the results identified five top drivers.
  • #14 Student engagement data: when, how long, etc. Inform course design: reflect on places where students fail. Know ‘why’ students struggle: it’s not good enough to just know that students fail certain questions.
  • #15 Inform teaching support and curriculum design so that no one is falling behind or having to learn the same materials repetitively. Support a widening access policy – at a class level. Support students at all achievement levels to improve learning by providing them a better overview of their own learning progress.
  • #16 No one-size-fits-all solutions: needs vary by institution and across subjects and faculties, but existing solutions focus on addressing retention problems. Uncertainty about the benefits of LA: fear of failing to meet expectations.
  • #17 Correlation does not suggest causation: e.g., engaging in discussion forums does not necessarily prevent students from failing, even though data may suggest a correlation between forum engagement and learning success.
  • #19 GDPR Article 6 (‘Lawfulness of processing’) offers alternatives to consent (Article 7) by allowing institutions to process personal data where this is necessary for ‘legitimate interests’ or for tasks carried out in the ‘public interest’. Three purposes: to comply with legal requirements, such as visas; to improve educational services, such as learning support, teaching delivery, career development, educational resources management, and the support of student well-being; to improve the overall performance of the university, such as league rankings, equality, and the recruitment of future students. Anonymity: okay with personal tutors, but not with tutors who may be involved in marking student performance. Access: extreme distrust of third parties for fear of becoming marketing targets. Although the participants had strong views about protecting their privacy and expectations about how their data should be used, they did not feel that they had sufficient understanding of existing data practices to critically question their legitimacy – the privacy paradox. This phenomenon suggests that institutions need to scale up transparency and effective communication with students.
  • #21 The ROMA model was originally designed to support policy and strategy processes in the field of international development. The model begins with defining an overarching policy objective, followed by six steps designed to provide policy makers with context-based information. It allows decision makers to identify key factors that enable or impede the implementation of learning analytics, and the reflective process allows policy goals to be refined and adapted as the context changes over time. (A rough checklist sketch of the cycle follows these notes.)
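
The sketch below illustrates the importance-vs-ease comparison described in note #8: each policy theme receives a mean ‘importance’ rating and a mean ‘ease of implementation’ rating from experts, and a large gap flags a likely adoption challenge. The theme names echo the note, but every number here is a hypothetical placeholder rather than SHEILA survey data.

# Illustrative Python sketch only: hypothetical ratings, not SHEILA results.
importance = {
    "Privacy & ethics": 4.6,
    "Management & goals": 4.3,
    "Data management & analysis": 4.1,
}
ease_of_implementation = {
    "Privacy & ethics": 3.4,  # note #8: privacy/transparency actions rated easiest to implement
    "Management & goals": 2.9,
    "Data management & analysis": 2.7,
}

# A large gap (rated important but hard to implement) flags a likely adoption challenge.
gaps = {theme: importance[theme] - ease_of_implementation[theme] for theme in importance}
for theme, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{theme}: importance minus ease = {gap:.1f}")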
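
As a rough illustration of the ROMA cycle in note #21, the sketch below models the process as an ordered, repeatable checklist. The step names paraphrase the adaptation of ROMA for learning analytics in Macfadyen et al. (2014) rather than quoting the slides, and the code is a hypothetical sketch, not part of the SHEILA project.

# Illustrative Python sketch only: ROMA as an iterative checklist.
ROMA_STEPS = [
    "Define the overarching policy objective",
    "Map the institutional and political context",
    "Identify the key stakeholders",
    "Identify the desired behaviour changes",
    "Develop an engagement strategy",
    "Analyse internal capacity to effect change",
    "Establish monitoring and learning frameworks",
]

def next_iteration(completed: set[str]) -> list[str]:
    """Return the steps still to address. ROMA is iterative: after the
    monitoring step, the cycle restarts and goals are refined as the
    institutional context changes."""
    return [step for step in ROMA_STEPS if step not in completed]

completed = {"Define the overarching policy objective"}  # e.g., objective agreed by senior management
for step in next_iteration(completed):
    print("Still to address:", step)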