Slides of my keynote talk at the SIAIA '23 workshop held at AAAI 2023:
The use of AI in education can be traced to the early days of AI. While the publicity around the most recent wave of AI applications rarely mentions education, it is through improving education that AI could achieve an impressive social impact. In particular, AI's ability to personalize the learning process could make a large difference in a context where knowledge can differ radically from learner to learner. Modern computer and internet technologies can now bring the power of learning, in the form of MOOCs, online textbooks, and Zoom courses, truly worldwide. Yet without personalization, the potential of these technologies is not fully leveraged. In this talk, I will review several generations of research on personalized learning and discuss tools, technologies, and infrastructures for personalized learning that we are currently exploring.
Personalized Learning: Expanding the Social Impact of AI
1. Personalized Learning:
Expanding the Social
Impact of AI
Peter Brusilovsky with:
Sergey Sosnovsky, Michael Yudelson, Sharon Hsiao,
Julio Guerra, Yun Huang, Roya Hosseini, Jordan
Barria-Pineda, Kamil Akhuseyinoglu
School of Computing and Information,
University of Pittsburgh
9. Best Use of Personalization?
• Learning results from what the student does and
thinks, and only from what the student does and
thinks. The teacher can advance learning only by
influencing what the student does to learn.
• Herbert A. Simon (1916–2001)
10. • An online learning system should engage students in meaningful learning activities
Image credit: http://merchandisingblog.inspire.ca/find-the-hidden-treasure/
People Learn through Activities
11. • Each student needs different activities (kind, amount, order)
• A personalized learning
system could use data
about each student
(knowledge, goals, …) to
guide them to the next
most relevant activity
People Learn Differently
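As a toy sketch of this guidance idea (the activity format and field names are my own illustration, not taken from any of the systems discussed), a system could pick the next activity by preferring goal topics where the student's knowledge is weakest:

```python
def next_activity(activities, knowledge, goal_topics):
    """Among activities on the current goal topics, pick the one whose
    topic the student knows least -- a minimal 'next most relevant
    activity' heuristic."""
    candidates = [a for a in activities if a["topic"] in goal_topics]
    return min(candidates, key=lambda a: knowledge.get(a["topic"], 0.0))

activities = [{"id": 1, "topic": "loops"}, {"id": 2, "topic": "arrays"},
              {"id": 3, "topic": "recursion"}]
knowledge = {"loops": 0.8, "arrays": 0.2, "recursion": 0.5}
chosen = next_activity(activities, knowledge, {"loops", "arrays"})
```

Here the student is weakest on "arrays" among the goal topics, so that activity would be suggested next.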
13. MOOC Completion Rate
Classic "user modeling – adaptation" loop in adaptive systems
http://www.katyjordan.com/MOOCproject.html
14. • Assessment-based
• Same for all
• Not enough doing
• Weak feedback loop
• High threshold
– From reading to
complex problems
– Many are not ready
Labs/Homework Don’t Cover the Need
15. Personalized Practice for CS Education
• Why Personalized Practice?
– Everyone can work as much as necessary
– Everyone can focus on topics and concepts where the
knowledge is weakest
– Low stakes might help to prevent cheating
• Why CS Education?
– One of the hot topics – a huge number of incoming students
– The area where starting knowledge and learning speed can differ radically
16. CS Education: Interactive Tools
• Improving homework
– Better IDEs, Autograding, Extended feedback
• Beyond homework: “smart” practice content
– Program visualization (e.g., Python Tutor)
– Practice problems (e.g., CodingBat)
– Worked examples (e.g., WebEx)
18. The Problem of Engagement
• Great free content and top teachers are not
enough to engage students
• Peter Norvig: Motivation and engagement are
key problems for MOOCs
• A lot of great practice content
– Works perfectly in lab studies, great gains
– Released to students for free use to enhance learning
– No impact – students do not use it
19. Recipes for Personalized Engaging Practice
• Adaptive navigation support
• Open learner modeling
• Social comparison
• Knowledge/opportunity visualization
• Content recommendation
– Proactive
– Remedial
– Explainable
20. QuizPACK: Code Tracing Exercises
• QuizPACK: Quizzes for
Parameterized Assessment of
C Knowledge
• Each question is a pattern of a
simple C program. When it is
delivered to a student the
special parameter is
dynamically instantiated by a
random value within the pre-
assigned borders.
• Used mostly as a self-
assessment tool in two C-
programming courses
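The parameterized-question mechanism can be sketched in a few lines of Python rather than C (the template format, function name, and the use of a final `result` variable are my assumptions for illustration, not QuizPACK's actual design):

```python
import random

def instantiate(template: str, lo: int, hi: int):
    """Fill the question template's parameter with a random value drawn
    from the pre-assigned borders, then compute the correct answer by
    executing the instantiated pattern."""
    value = random.randint(lo, hi)
    code = template.replace("<PARAM>", str(value))
    env = {}
    # QuizPACK patterns are simple C programs traced by the student;
    # here a Python snippet's final 'result' plays that role.
    exec(code, env)
    return code, env["result"], value

template = "x = <PARAM>\nresult = x * 2 + 1"
question, answer, value = instantiate(template, 1, 10)
```

Because the parameter is re-drawn on every delivery, the same pattern yields a fresh question each time a student requests it.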
21. QuizPACK: Value and Problems
• Good news:
– Activity with QuizPACK significantly correlated with student performance in classroom quizzes
– Knowledge gain rose from 1.94 to 5.37
• But:
– Low success rate – below 40%
– The system is under-used (used less than it deserves)
• Fewer than 10 sessions on average
• Average course coverage below 40%
– We need personalization and engagement!
22. Engaging Known Recipes: OLM + ANS
• Open Learner Modeling
– Increases motivation
– Supports self-organized learning
• Adaptive navigation support
– Lower navigation overhead
• Access the content at the right time
• Find relevant information faster
– Better learning outcomes
23. Questions of the current quiz, served by QuizPACK
List of annotated links to all quizzes available for a student in the current course
Refresh and help icons
QuizGuide = QuizPACK + ANS
24. QuizGuide: OLM+ANS
• Target-arrow abstraction:
– Number of arrows – level of knowledge for the specific topic (from 0 to 3). Individual, event-based adaptation.
– Color intensity – learning goal (current, prerequisite for current, not-relevant, not-ready). Group, time-based adaptation.
• Topic–quiz organization
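The target-arrow abstraction could be sketched like this (the thresholds, status names, and color labels are illustrative, not QuizGuide's exact values):

```python
def target_icon(knowledge: float, goal_status: str):
    """Map a student's topic knowledge (0..1) to 0-3 arrows on the
    target icon, and the topic's relation to the course's current
    learning goal to a color intensity."""
    arrows = min(3, int(knowledge * 4))  # individual, event-based
    intensity = {"current": "bright",     # group, time-based
                 "prerequisite": "medium",
                 "not-relevant": "pale",
                 "not-ready": "dimmed"}[goal_status]
    return arrows, intensity
```

The two dimensions update on different schedules: arrows change as the individual student works, while the color follows the class-level schedule of learning goals.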
25. QuizGuide: Success Rate
• It works!
• Mean success value for QuizGuide is significantly larger than the one for QuizPACK: F(1, 43) = 5.07 (p = 0.03)
26. QuizGuide: Motivation
• Adaptive navigation support increased students' activity and persistence in using the system
[Charts: average activity, average number of sessions, average course coverage, and share of active students across 2002–2004]
• Within the same class, QuizGuide sessions were much longer than QuizPACK sessions: 24 vs. 14 question attempts on average
• Average knowledge gain for the class rose from 5.1 to 6.5
27. Checking in Other Domains…
• Is the effect specific to C programming or to this special kind of content?
• Near transfer
– Java instead of C, complex problems
• Far transfer
– SQL Programming instead of C
– Programming problems (code writing) instead of
questions (code evaluation)
– Give students a choice of how to access content
29. • To investigate the possible influence of concept-based adaptation in the presence of topic-based adaptation, we developed two versions of SQL-Guide: topic-based and topic-based + concept-based
30. • Total number of attempts made by all students: 4081 in adaptive mode vs. 1218 in non-adaptive mode
• Students in general were much more willing to access the adaptive version of the system, explored more content with it, and stayed with it longer:
[Charts: questions, quizzes, topics, sessions, and session length, adaptive vs. non-adaptive]
Confirmed… and Students Prefer It
31. Social Comparison and Navigation
• OLM and adaptive navigation support work well to
increase success and motivation
• Knowledge-based approaches require some
knowledge engineering – concept/topic models,
prerequisites, time schedule
• In our past work we learned that social navigation –
guidance extracted from the work of a community of
learners – might replace knowledge-based guidance
• Social wisdom vs. knowledge engineering
32. Open Social Learner Modeling
• Key ideas
– Show topic- and content-level knowledge progress of a student in contrast to the same progress of the class
– Use social comparison to engage and guide students
• Main challenge
– How to design the interface to show student and class
progress over topics?
– We went through several attempts
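The comparison data behind such an interface can be sketched as follows (the data structure and field names are hypothetical, not from any of our systems):

```python
def social_comparison(student, peers):
    """Contrast a student's per-topic progress with the class average --
    the core data behind an open social learner model display."""
    report = {}
    for topic, mine in student.items():
        class_avg = sum(p.get(topic, 0.0) for p in peers) / len(peers)
        report[topic] = {"me": mine, "class": class_avg,
                         "ahead": mine >= class_avg}
    return report

student = {"loops": 0.9, "arrays": 0.3}
peers = [{"loops": 0.5, "arrays": 0.6}, {"loops": 0.7, "arrays": 0.8}]
report = social_comparison(student, peers)
```

The "ahead"/"behind" contrast per topic is what drives the social-comparison nudge: a topic where the student trails the class is a natural next target.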
36. Class vs. Peers
• Peer progress was important, students
frequently accessed content using peer models
• The more the students compared themselves to their peers, the higher post-quiz scores they received (r = 0.34, p = 0.004)
• Parallel IV didn't allow students to recognize good peers before opening the model
• Progressor added clear peer progress
comparison
37. Progressor+ OSLM for two types of content
• Macro- and micro-comparisons (group or peers)
38. Students Spent More Time in Progressor+
Total time spent (minutes), Quiz / Example:
QuizJET: 60.04 / 69.52
JavaGuide: 150.19 / 121.23
Progressor: 224.70 / 110.66
Progressor+: 296.90 / 321.10 (Quiz ≈ 5 hours; Example ≈ 5 hours 20 mins)
40. Mastery Grids: Personalized
Practice System with OSLM
Loboda, T. D., Guerra, J., Hosseini, R., & Brusilovsky, P. (2014, September). Mastery grids: An open source social
educational progress visualization. In European conference on technology enhanced learning (pp. 235-248). Springer,
Cham.
Learning content
OSLM Features
41. MasteryGrids
• Adaptive Navigation Support
• Topic-based Adaptation
• Open Social Learner Modeling
• Social Educational Progress Visualization
• Multiple Content Types
• Concept-Based Recommendation
• Open Source
43. Topic-Level vs. Concept-level OLM
Guerra Hollstein, J., Barria Pineda, J., Schunn, C., Bull, S., and Brusilovsky, P. (2017) Fine-Grained Open Learner Models: Complexity Versus Support. In: Proceedings of the 25th Conference on User Modeling, Adaptation and Personalization, Bratislava, Slovakia, ACM, pp. 41-49.
44. Impact on Learning
• Student knowledge significantly increased in both
groups
• The number of attempted problems significantly predicts the final grade (SE = 0.04, p = .017)
• The coefficient for the number of problem attempts was 0.09, meaning attempting 100 problems increases the final grade by 9 points
• The mean learning gain was higher for both weak and strong students in the OSSM group
• The difference was significant for weak students (p = .033)
45. Personalized Visual Support for
Activity Selection with Rich-OLM
Guerra, J., C. Schunn, S. Bull, J.
Barria-Pineda and P. Brusilovsky
(2018). Navigation support in
complex open learner models:
assessing visual design
alternatives. New Review of
Hypermedia and
Multimedia 24(3): 160-192.
46. Mousing over an activity
Concepts in the selected activity are highlighted
This gauge estimates how much you can learn in the selected activity. You will probably learn more in activities that have more new concepts
Guerra-Hollstein, J., Barria-Pineda, J., Schunn, C., Bull, S.,
and Brusilovsky, P. (2017) Fine-Grained Open Learner
Models: Complexity Versus Support. In: Proceedings of the
25th Conference on User Modeling, Adaptation and
Personalization, Bratislava, Slovakia, ACM, pp. 41-49.
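The gauge's idea – more new concepts means more to learn – can be sketched as a simple heuristic (this is my simplified illustration, not the actual model behind the gauge):

```python
def learning_potential(activity_concepts, knowledge):
    """Estimate how much a student can learn from an activity as the
    summed knowledge gap over its concepts: activities covering more
    new (low-knowledge) concepts score higher."""
    return sum(1.0 - knowledge.get(c, 0.0) for c in activity_concepts)

knowledge = {"while-loop": 0.9, "nested-loop": 0.1}
familiar = learning_potential({"while-loop"}, knowledge)
rich = learning_potential({"while-loop", "nested-loop"}, knowledge)
```

An activity that adds the nearly unknown "nested-loop" concept scores much higher than one built only of well-known concepts, matching the annotation's advice.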
47. Explanations with OLM (ATEC Workshop 2019, Los Angeles)
Barria-Pineda, Jordan, and Peter Brusilovsky. 2019. "Explaining Educational Recommendations Through a Concept-level Knowledge Visualization." In Proceedings of the 24th International Conference on Intelligent User Interfaces: Companion, 103–104. New York, NY, USA: ACM.
48. Adding Direct Recommendations
• Stronger guidance than Adaptive
Navigation Support
• Proactive recommendations
– Expand knowledge
• Remedial recommendation
– Address problems
• Explanations
– Visual explanations with OLM
– Text-based explanations
52. Remedial Recommendations
Remedial visual
explanations
Related concepts highlighted
Knowledge estimates as bar-chart
Recent success rate as bar-color
Warning sign on “struggled”
concepts
Barria-Pineda, J., Akhuseyinoglu, K., and Brusilovsky, P. (2019) Explaining Need-based Educational
Recommendations Using Interactive Open Learner Models. Proceedings of International Workshop on
Transparent Personalization Methods based on Heterogeneous Personal Data, ExHUM at the 27th
ACM Conference On User Modelling, Adaptation And Personalization, UMAP '19, Larnaca, Cyprus
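The visual encoding of the remedial explanation could be sketched as follows (the thresholds and color names are illustrative, not the system's actual values):

```python
def concept_bar(knowledge: float, recent_success: float):
    """Render one concept of a remedial visual explanation: bar length
    from the knowledge estimate, bar color from the recent success
    rate, and a warning mark on 'struggled' concepts."""
    if recent_success >= 0.6:
        color = "green"
    elif recent_success >= 0.3:
        color = "orange"
    else:
        color = "red"
    return {"length": knowledge, "color": color,
            "warning": recent_success < 0.3}
```

A concept the student knows moderately well but has recently failed would show as a medium bar in red with a warning sign, flagging it as a remediation target.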
54. What Are We Doing Now?
• Connecting mandatory work (labs and homework) with free practice
• By seeing both your knowledge and your target (e.g., lab, exam, homework), you can get ready for the challenge
• If you struggle in your lab, the system records it (as well as your successes)
• You can run remedial adaptive practice after the lab
55. Acknowledgements
• Joint work with
– Sergey Sosnovsky, Michael Yudelson
– Rosta Farzan, Sharon Hsiao, Tomek Loboda
– Yun Huang, Julio Guerra, Roya Hosseini
– Jordan Barria, Kamil Akhuseyinoglu
• U. of Pittsburgh “Innovation in Education” awards
• NSF Grants
– CAREER 0447083
– DUE 0310576
– DUE 0633494
– IIS 0426021
– DLR 1740775
• ADL.net support for OSLM work