Addictive Links:
Adaptive Navigation Support
for College-Level Courses
Peter Brusilovsky with:
Sergey Sosnovsky, Michael Yudelson, Sharon Hsiao
School of Information Sciences,
University of Pittsburgh
User-Adaptive Systems
•  The classic user modeling - adaptation loop: on the user modeling side, the
adaptive system collects information about the individual user into the user
model; on the adaptation side, the user model is used to provide the
adaptation effect.
Adaptive Software Systems
•  Intelligent Tutoring Systems
–  adaptive course sequencing
–  adaptive . . .
•  Adaptive Hypermedia Systems
–  adaptive presentation
–  adaptive navigation support
•  Adaptive Help Systems
•  Adaptive . . .
Adaptive Hypermedia
•  Hypermedia systems = Pages + Links
•  Adaptive presentation
–  content adaptation
•  Adaptive navigation support
–  link adaptation
Adaptive Navigation Support
•  Direct guidance
•  Hiding, restricting, disabling
•  Generation
•  Ordering
•  Annotation
•  Map adaptation
Adaptive Link Annotation
Adaptive Link Annotation: InterBook
 
1. Concept role
2. Current concept state
3. Current section state
4. Linked sections state
Metadata-based mechanism
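To make the metadata-based mechanism concrete, here is a minimal sketch of InterBook-style link-state computation; the concept names and the rule (a section is "ready" when all of its prerequisite concepts are known, "learned" when its outcome concepts are known) are simplified assumptions, not InterBook's actual algorithm:

    # Hypothetical metadata: prerequisite and outcome concepts of each section.
    SECTIONS = {
        "while-loops": {"prerequisites": {"variables", "conditions"}, "outcomes": {"loops"}},
        "arrays":      {"prerequisites": {"loops"},                   "outcomes": {"arrays"}},
    }

    def link_state(section: str, known: set) -> str:
        """Classify a link from the user's known concepts (drives the annotation icon)."""
        meta = SECTIONS[section]
        if meta["outcomes"] <= known:
            return "learned"        # e.g. a checkmark
        if meta["prerequisites"] <= known:
            return "ready"          # e.g. a green bullet
        return "not ready"          # e.g. a red bullet

    known = {"variables", "conditions"}
    print(link_state("while-loops", known))   # -> ready
    print(link_state("arrays", known))        # -> not ready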
Personalized Information Access
•  Adaptive Hypermedia: navigation, via a metadata-based adaptation mechanism
•  Adaptive IR: search, via a keyword-based adaptation mechanism
•  Web Recommenders: recommendation, via a community-based adaptation mechanism
Adaptive Link Annotation: ScentTrails
Content-based mechanism
Adaptive Link Annotation: CoWeb
Social navigation mechanism
The Value of ANS
•  Lower navigation overhead
–  Access the content at the right time
–  Find relevant information faster
•  Better learning outcomes
–  Achieve the same level of knowledge faster
–  Better results within a fixed time
•  Encourages non-sequential navigation
The Case of QuizPACK
•  QuizPACK: Quizzes for
Parameterized Assessment of
C Knowledge
•  Each question is a pattern
(template) of a simple C program.
When the question is delivered to
a student, a special parameter is
dynamically instantiated with a
random value within pre-assigned
bounds (see the sketch below).
•  Used mostly as a self-
assessment tool in two C-
programming courses
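A minimal sketch of how such a parameterized question can be instantiated; the template text, parameter bounds, and answer rule are hypothetical illustrations, not actual QuizPACK content:

    import random

    # Hypothetical question template: the <P> slot is filled with a random
    # value within pre-assigned bounds each time the question is delivered.
    TEMPLATE = """int x = <P>;
    x = x * 2 + 1;
    printf("%d", x);   /* What does this program print? */"""

    def instantiate(template: str, low: int, high: int):
        """Fill the parameter slot and compute the correct answer for grading."""
        p = random.randint(low, high)
        question = template.replace("<P>", str(p))
        answer = p * 2 + 1              # mirrors the arithmetic in the C pattern
        return question, answer

    question, answer = instantiate(TEMPLATE, low=1, high=10)
    print(question)
    print("expected output:", answer)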
QuizPACK: Value and Problems
•  Good news:
–  activity with QuizPACK significantly correlated with
student performance in classroom quizzes
–  Knowledge gain rose from 1.94 to 5.37
•  But:
–  Low success rate - below 40%
–  The system is under-used (used less than it deserves)
•  Fewer than 10 sessions on average
•  Average Course Coverage below 40%
Adding Motivation (2003)
•  Students need some better motivation to work with non-
mandatory educational content…
•  Added classroom quizzes:
–  Five randomly initialized questions out of 20-30 questions
assigned each week
•  Good results - activity, percentage of active questions,
course coverage - all increased 2-3 times! But still not as
much as we want. Could we do better?
•  Maybe students bump into the wrong questions? Too easy?
Too complicated? Discouraging…
•  Let’s try something that worked in the past
QuizGuide = QuizPACK + ANS
•  Questions of the current quiz, served by QuizPACK
•  List of annotated links to all quizzes available for a
student in the current course
•  Refresh and help icons
Demo: QuizPACK
•  KT Portal - http://adapt2.sis.pitt.edu/cbum/
•  Create your account or use test / test
Topic-Based Adaptation
[Figure: topics (Concept A, Concept B, Concept C), each linked to its learning activities]
•  Each topic is associated with a number of
educational activities to learn about this topic
•  Each activity is classified under exactly 1 topic
(see the sketch below)
QuizGuide: Adaptive Annotations
•  Target-arrow abstraction (sketched below):
–  Number of arrows – level of knowledge for the
specific topic (from 0 to 3). Individual,
event-based adaptation.
–  Color intensity – learning goal (current,
prerequisite for current, not relevant, not
ready). Group, time-based adaptation.
•  Topic–quiz organization:
QuizGuide: Architecture
[Figure: architecture connecting the Student's Browser, QuizGuide, QuizPACK, and the User Model]
QuizGuide: Motivation
•  Adaptive navigation support increased students' activity and
persistence in using the system
[Charts: average activity, average number of sessions, average course
coverage, and share of active students, compared across 2002, 2003, and 2004]
•  Within the same class, QuizGuide sessions were much longer than
QuizPACK sessions: 24 vs. 14 question attempts on average.
•  Average knowledge gain for the class rose from 5.1 to 6.5.
QuizGuide: Success Rate?
n Well, that worked too, but
the scale is not comparable
n One-way ANOVA shows
that mean success value for
QuizGuide is significantly
larger then the one for
QuizPACK:
F(1, 43) = 5.07
(p-value = 0.03).
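A minimal sketch of reproducing this kind of comparison with SciPy's one-way ANOVA; the per-student success rates below are made-up placeholders, not the study data:

    from scipy.stats import f_oneway

    # Hypothetical per-student success rates (fraction of correct attempts)
    # for the QuizGuide and QuizPACK groups.
    quizguide = [0.55, 0.62, 0.48, 0.71, 0.66, 0.59]
    quizpack  = [0.38, 0.41, 0.35, 0.52, 0.44, 0.40]

    f_stat, p_value = f_oneway(quizguide, quizpack)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")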
A new value of ANS?
•  The scale of the effect is too large…
Maybe it is just good luck?
•  New effect after 15 years of research?
•  Not quite new, rather ignored and
forgotten - ELM-ART data
ELM-ART: Navigation Support
Round 2: Let’s Try it Again…
•  Maybe the effect could only be discovered in
full-scale classroom studies – while past studies
were lab-based?
•  Another study with the same system
–  QuizGuide+QuizPACK vs. QuizPACK
•  A study with another system using similar kinds
of adaptive navigation support
–  NavEx+WebEx vs. WebEx
•  NavEx is a value-added ANS front-end for WebEx,
an interactive example exploration system
WebEx - Code Examples
Concept-based student modeling
[Figure: examples (Example 1 … Example M) and problems (Problem 1 … Problem K)
are indexed with a shared set of concepts (Concept 1 … Concept N)]
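A minimal sketch of concept-based student modeling under such indexing, assuming a simple additive update rule (the real systems use more elaborate evidence propagation):

    from collections import defaultdict

    # Hypothetical concept indexing of learning objects (examples and problems).
    CONCEPTS = {
        "example_1": ["loops", "arrays"],
        "problem_1": ["loops"],
        "problem_2": ["arrays", "pointers"],
    }

    class ConceptModel:
        """Per-student knowledge estimates over concepts, kept in [0, 1]."""
        def __init__(self, rate: float = 0.2):
            self.knowledge = defaultdict(float)
            self.rate = rate                    # assumed learning-rate constant

        def observe(self, learning_object: str, success: bool) -> None:
            """Propagate one attempt to every concept of the learning object."""
            if not success:
                return
            for concept in CONCEPTS[learning_object]:
                # move the estimate a fraction of the way toward 1.0
                self.knowledge[concept] += self.rate * (1.0 - self.knowledge[concept])

    model = ConceptModel()
    model.observe("problem_1", success=True)
    model.observe("example_1", success=True)
    print(dict(model.knowledge))    # e.g. {'loops': 0.36, 'arrays': 0.2}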
NavEx = WebEx + ANS
Does it work?
•  The amount of work for the course increased
[Charts: overall clicks, lectures, and learning objects per student,
non-adaptive vs. adaptive, shown separately for examples and quizzes]
Is It Really Addictive?
•  Are they coming more often? Mostly, but there
is no stable effect
•  But when they come, they stay… like with an
addictive game
[Charts: clicks per session and learning objects per session,
non-adaptive vs. adaptive, for examples and quizzes]
Why Is It Working?
•  Progress-based annotation
–  Displays the progress achieved so far
–  Does it work as a reward mechanism?
–  Open Student Modeling
•  State-based annotation
–  Not useful, ready, not ready
–  Access activities at the right time
–  Appropriate difficulty keeps up motivation
A Deeper Look
The Diversity of Work
•  C-Ratio: Measures the breadth of exploration
•  Goal distance: Measures the depth
[Charts: self-motivated work measured by C-ratio (%) and goal distance
(in learning objects), non-adaptive vs. adaptive, for quizzes and examples]
Round 3: Trying another domain…
•  Is the effect specific to C programming or to this
simple kind of content?
•  New changes:
–  SQL Programming instead of C
–  Programming problems (code writing) instead of
questions (code evaluation)
–  Comparison of concept-based and topic-based
mechanisms in the same domain and with the same
kind of content
SQL Knowledge Tester
•  SQL-KnoT delivers online SQL problems, checks the student's
answers, and provides corrective feedback
•  Every problem is dynamically generated using a template and a
set of databases (see the sketch below)
•  All problems have been assigned to one of the course topics and
indexed with concepts from the SQL ontology
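A minimal sketch of template-based SQL problem generation; the template, the database catalog, and the reference answer are hypothetical and only illustrate instantiating one template against different sample databases:

    import random

    # Hypothetical catalog of sample databases a template can be instantiated against.
    DATABASES = {
        "university": {"table": "students", "column": "gpa"},
        "store":      {"table": "orders",   "column": "total"},
    }

    TEMPLATE = "List all rows of {table} where {column} is above the average {column}."
    ANSWER   = "SELECT * FROM {table} WHERE {column} > (SELECT AVG({column}) FROM {table});"

    def generate_problem():
        """Pick a database and fill both the problem text and the reference query."""
        db = random.choice(list(DATABASES))
        slots = DATABASES[db]
        return db, TEMPLATE.format(**slots), ANSWER.format(**slots)

    db, text, reference = generate_problem()
    print(db)
    print(text)
    print(reference)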
Concept-based vs. Topic-based ANS
•  To investigate the possible influence of concept-based
adaptation in the presence of topic-based adaptation, we
developed two versions of QuizGuide:
–  Topic-based
–  Topic-based + Concept-based
Study Design
•  Two database courses (Fall 2007):
§  Undergraduate (36 students)
§  Graduate (38 students)
•  Each course divided into two groups:
§  Topic-based navigation
§  Topic-based + concept-based navigation
•  All students had access to the same set of SQL-KnoT problems,
available in adaptive (QuizGuide) and non-adaptive (Portal) mode
It works again! Like magic…
•  Total number of attempts made by all students: 4081 in
adaptive mode vs. 1218 in non-adaptive mode
•  Students in general were much more willing to access the
adaptive version of the system, explored more content with it,
and stayed with it longer:
[Charts: questions, quizzes, topics, sessions, and session length
per student, adaptive vs. non-adaptive]
Concept-based ANS: Added Value?
•  Did concept-based adaptation increase the magnitude of the
motivational effect?
§  No significant difference in the average number of attempts,
problems answered, or topics explored
§  No significant difference in session length
•  Was there any other observable difference?
§  Average number of attempts per question
§  Resulting knowledge level (averaged across all concepts)
[Charts: attempts per question and knowledge level, combined vs.
topic-based groups]
Pattern Analysis
•  Question-based patterns (see the labeling sketch below):
Sequence, Go-Back, Skipping, Repetition
–  Repetition 0: incorrect previous attempt
–  Repetition 1: correct previous attempt
•  Topic-based patterns: Next Topic, Jump-Forward,
Jump-Backward, Combined
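A minimal sketch of labeling question-level patterns from an ordered click log; the log format and the rule separating "Sequence" from "Skipping" are assumptions for illustration:

    def question_patterns(log):
        """Label each transition in one student's click log.
        log: ordered list of (question_index, answered_correctly) tuples."""
        labels = []
        for (prev_q, prev_ok), (q, _) in zip(log, log[1:]):
            if q == prev_q:
                labels.append("Repetition-1" if prev_ok else "Repetition-0")
            elif q == prev_q + 1:
                labels.append("Sequence")
            elif q > prev_q + 1:
                labels.append("Skipping")
            else:
                labels.append("Go-Back")
        return labels

    log = [(1, False), (1, True), (2, True), (4, True), (3, False)]
    print(question_patterns(log))
    # -> ['Repetition-0', 'Sequence', 'Skipping', 'Go-Back']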
Pattern Analysis (2)
[Chart: topic pattern distribution (Next-Topic, Jump-Forward, Jump-Backward)
for the combined and topic-based groups, undergraduate and graduate]
Pattern Analysis (3)
[Chart: question pattern distribution (Sequence, Go-Back, Skipping,
Repetition-0, Repetition-1) for the combined and topic-based groups,
undergraduate and graduate]
Discussion
•  The difference in the ratio of the Repetition-1 pattern explains:
§  the difference in the average number of attempts per question
§  the difference in the cumulative resulting knowledge level
•  Students repeat the same question again and again:
§  They “get addicted“ to the concept-based icons
§  Is it a good thing for us?
−  YES – they react to the navigational cues, they work more
−  NO – we expect them to concentrate on questions where they have
made less progress instead of drilling the same question
Round 4: The Issue of Complexity
•  Let’s now try it for Java…
•  What is the research goal?
•  Java is a more sophisticated domain than C
–  OOP versus Procedural
–  Higher complexity
•  Will it work for complex questions?
•  Will it work similarly?
[Chart: language complexity, share of easy, moderate, and hard
questions for C vs. Java]
Meet QuizJET!
JavaGuide
[Screenshot: JavaGuide interface with a navigation area and a presentation area]
Magic… Here We Go Again!

                                 JavaGuide           QuizJET
                                 (Fall 2008, n=22)   (Spring 2008, n=31)
Overall user statistics
  Attempts                       125.50              41.71
  Success rate                   58.31%              42.63%
  Distinct topics                11.77               4.94
  Distinct questions             46.18               17.23
Average user session statistics
  Attempts                       30.34               21.50
  Distinct topics                2.85                2.55
  Distinct questions             11.16               8.88
The Effect Depends on Complexity
•  Significantly more attempts on the easy questions in
JavaGuide than in QuizJET
•  F(1, 153) = 7.081, p = .009, partial η² = .044
Different Pattern for Success Rate
•  Significantly higher success rate
•  F(1, 153) = 40.593, p < .001, partial η² = .210
•  On average, 2.5 times more likely to answer a quiz
correctly with adaptive navigation support
[Chart: success rate by complexity level, JavaGuide vs. QuizJET
(Easy: 68.73% / 67.00%, Moderate: 39.32% / 38.00%, Hard: 28.23% / 11.90%)]
Putting It Together…
[Charts: average attempts per question and success rate across
complexity levels (Easy, Moderate, Hard), JavaGuide vs. QuizJET]
•  Prerequisite-based guidance prepared students to attempt
complex questions after exploring easier ones
Round 5: Social Navigation
•  Concept-based and topic-based navigation support
work well to increase success and motivation
•  Knowledge-based approaches require some
knowledge engineering – concept/topic models,
prerequisites, time schedule
•  In our past work we learned that social navigation –
guidance extracted from the work of a community of
learners – might replace knowledge-based guidance
•  Social wisdom vs. knowledge engineering
Open Social Student Modeling
•  Key ideas
–  Assume simple topic-based design
–  No prerequisites or concept modeling
–  Show topic- and content-level knowledge progress of a
student in contrast to the same progress of the class
(sketched below)
•  Main challenge
–  How to design the interface to show student and class
progress over topics?
–  We went through several attempts
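A minimal sketch of the comparison such an interface needs to render: per-topic progress of one student next to the average progress of the class; the data layout is a hypothetical simplification:

    # Hypothetical per-student progress per topic (fraction of topic questions solved).
    progress = {
        "alice": {"objects": 0.9, "inheritance": 0.4, "exceptions": 0.1},
        "bob":   {"objects": 0.6, "inheritance": 0.7, "exceptions": 0.3},
        "carol": {"objects": 0.3, "inheritance": 0.2, "exceptions": 0.0},
    }

    def class_average(topic):
        values = [p[topic] for p in progress.values()]
        return sum(values) / len(values)

    def compare(student):
        """Return (own progress, class average) per topic - the pair an OSSM view shows."""
        return {t: (own, round(class_average(t), 2))
                for t, own in progress[student].items()}

    print(compare("alice"))
    # -> {'objects': (0.9, 0.6), 'inheritance': (0.4, 0.43), 'exceptions': (0.1, 0.13)}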
QuizMap
Parallel Introspective Views
Results: Progress
[Chart: attempts per student for QuizJET+IV, QuizJET+Portal, and JavaGuide]
Class vs. Peers
•  Peer progress was important, students
frequently accessed content using peer models
•  The more the students compared themselves to their peers,
the higher post-quiz scores they received (r = 0.34, p = 0.004)
•  Parallel IV didn’t let students recognize good peers
before opening the model
•  Progressor added clear peer progress
Results: Success
•  F(1, 32) = 11.303, p < .01
[Chart: success rate for QuizJET+IV (71.35%), QuizJET+Portal (42.63%),
and JavaGuide (58.31%)]
Why Is It Important?
•  Many systems demonstrated their educational
effectiveness in lab-like settings: once students
are pushed to use them, their learning benefits
•  However, once released to real classes, these systems
are under-used - most of them offer additional non-
mandatory learning opportunities
•  “Students are only interested in points and grades”
•  Convert all tools into credit-bearing activities?
•  Or use alternative approaches to increase motivation
Progressor
The Value of Peers
[Charts: attempts per student (Progressor 205.73, QuizJET+IV 113.05,
QuizJET+Portal 80.81, JavaGuide 125.5) and success rate (Progressor 68.39%,
QuizJET+IV 71.35%, QuizJET+Portal 42.63%, JavaGuide 58.31%)]
The Secret
Take-home messages
•  A combination of progress-based and state-
based adaptive link annotation increases the
amount and the diversity of student work with
non-mandatory educational content
•  The effect is stable and its scale is quite large
•  Properly organized social navigation might be at
least as successful as the knowledge-based approach
•  The effect requires a long-term classroom study to
observe
What are we doing now?
•  Exploring a new generation of open social
modeling tools in a wide variety of classes and
domains, from the US to Nigeria
–  Interested in being a pilot site?
•  Exploring more advanced guidance and
modeling approaches based on large volumes of
social data
•  Applying open social modeling to motivate
readings
Acknowledgements
•  Joint work with
–  Sergey Sosnovsky
–  Michael Yudelson
–  Sharon Hsiao
•  Pitt “Innovation in Education” grant
•  NSF Grants
–  EHR 0310576
–  IIS 0426021
–  CAREER 0447083
Try It!
•  http://adapt2.sis.pitt.edu/kt/
•  Brusilovsky, P., Sosnovsky, S., and Yudelson, M. (2009)
Addictive links: The motivational value of adaptive link annotation.
New Review of Hypermedia and Multimedia 15 (1), 97-118.
•  Hsiao, I.-H., Sosnovsky, S., and Brusilovsky, P. (2010) Guiding
students to the right questions: adaptive navigation support in an E-
Learning system for Java programming. Journal of Computer Assisted
Learning 26 (4), 270-283.
•  Hsiao, I.-H., Bakalov, F., Brusilovsky, P., and König-Ries, B. (2013)
Progressor: social navigation support through open social student
modeling. New Review of Hypermedia and Multimedia
Read About It!
