This article presents lessons learned from the analysis of interactional data from an online course offering certified basic-level Spanish (UFAL Línguas - Espanhol). The data was collected after the course ended and concerns the students’ interactions with the learning environment’s educational resources, which were represented and stored using ontologies and used by the Pedagogical Recommendation Process. This process aims to detect pedagogical practices happening in the classroom, discover the patterns responsible for these practices, create recommendations to improve the students’ performance, and monitor and evaluate whether the process is working appropriately. At the end of the analysis, we identified a considerable number of dropouts, as well as other students who failed despite being very close to approval. The results showed that if the Pedagogical Recommendation Process had been used while the course was in progress, we could have rescued some of these dropouts and assisted some of those who failed.
UFCG - SAC2014 - Lessons Learned from an Online Open Course: A Brazilian Case Study
1. LESSONS LEARNED FROM AN ONLINE OPEN
COURSE: A BRAZILIAN CASE STUDY
Ranilson Oscar Araújo Paiva
Dr. Ig Ibert Bittencourt Santana Pinto
SYMPOSIUM OF APPLIED COMPUTING 2014
GYEONGJU, SOUTH KOREA
2. Agenda
• The Online Open Course
• Problem Found
• Investigation
• Lessons Learned
• Conclusion
• References
2/27 Ranilson Paiva | SAC2014 - GyeongJu | Lessons Learned from an Online Open Course: A Brazilian Case Study
3. • Online Environment for
Language Learning
• Developed by the Federal
University of Alagoas – Brazil
• Offers free online courses
• Gamified Environment
• Used Ontologies
The Online Open Course
4. The Online Open Course
• UFAL Línguas Espanhol
– 2075 registration forms
• 200 submissions were accepted
– 100 Students from the University (UFAL)
– 100 Students from public schools (High School)
– 5-month course (October 2012 to February 2013)
– 3 Modules (With 2 Units Each)
• All classes were online
• Some exams were held in classroom
• 1 Teacher + 8 Tutors
– 780 megabytes of interactional data
• 1,200,000 RDF Triples
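The 1,200,000 RDF triples mentioned above are (subject, predicate, object) statements. A minimal sketch of how such interaction triples might be queried, using plain Python tuples; the predicate and resource names here are hypothetical, not taken from the course data:

```python
# RDF-style triples represented as plain (subject, predicate, object) tuples.
# All identifiers below are illustrative assumptions.
triples = [
    ("student:1", "interactedWith", "resource:exercise-3"),
    ("student:1", "interactedWith", "resource:video-1"),
    ("student:2", "interactedWith", "resource:exercise-3"),
    ("resource:exercise-3", "hasType", "Exercise"),
    ("resource:video-1", "hasType", "Video"),
]

def objects_of(subject, predicate, store):
    """Return every object linked to `subject` by `predicate`."""
    return [o for s, p, o in store if s == subject and p == predicate]

# Which resources did student:1 interact with?
print(objects_of("student:1", "interactedWith", triples))
# → ['resource:exercise-3', 'resource:video-1']
```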
5–8. The Online Open Course
12. Investigation
Research Questions
– What kind of interactions led students to fail?
– What kind of interactions led students to succeed?
– How can we detect these interactions?
– How can we help teachers react to them?
13. Investigation
According to [MORAN, 2006], students’ participation can be evaluated by analyzing their interactions with forums, chats, blogs, and lists.
14. Investigation
Educational Resources
• Exercises
– Check topics’ understanding
– No points awarded
• Mock Tests (Tests)
– Check unit’s learning
– No points awarded
• Activities
– Proposed by the teachers
– Awarded points
• Exams
– In-classroom and Mandatory
– Awarded points
15. Investigation
Interactional Points
(Educational Resources)
• Accesses*
• Videos*
• Forum Interactions
• Chat Interactions
• Exercises
– Answered and Correct
– Weights (easy = 1, medium = 3 and difficult = 5)
• Tests
– Answered and Correct
• Activities
– Answered and Correct
• In-Classroom Exams*
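The exercise weights above (easy = 1, medium = 3, difficult = 5) suggest how interaction points accumulate. A hedged sketch of that scoring, assuming that only correct answers earn the difficulty's weight (the slide does not state the exact scheme):

```python
# Exercise weights taken from the slide; the scoring rule (only correct
# answers score) is an assumption for illustration.
EXERCISE_WEIGHTS = {"easy": 1, "medium": 3, "difficult": 5}

def exercise_points(answered):
    """`answered` is a list of (difficulty, correct) pairs."""
    return sum(EXERCISE_WEIGHTS[d] for d, correct in answered if correct)

# One correct easy + one correct medium + one wrong difficult exercise:
print(exercise_points([("easy", True), ("medium", True), ("difficult", False)]))
# → 4
```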
18. Investigation
• Low Performance Interactions
– 0 to 1499 points
– 120 Students
– All failed
• Medium Performance Interactions
– 1500 to 3499 points
– 63 Students
– 40 failed / 23 succeeded
• High Performance Interactions
– 3500 to 5000 points
– 14 Students
– All succeeded
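The three bands above map interaction-point totals to performance groups. A direct sketch of that grouping, with the thresholds taken from the slide:

```python
def performance_band(points):
    """Map an interaction-point total to the band reported on the slide."""
    if points <= 1499:
        return "low"       # 0 to 1499: 120 students, all failed
    if points <= 3499:
        return "medium"    # 1500 to 3499: 63 students, 40 failed
    return "high"          # 3500 to 5000: 14 students, all succeeded

print(performance_band(1200), performance_band(2000), performance_band(4000))
# → low medium high
```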
19. Investigation
According to [ROMERO, 2011], Educational Data Mining (EDM) has the objective of finding relevant information (detecting significant patterns) in large amounts of data from educational contexts.
22. Lessons Learned
• What kind of interactions led students to fail?
– Low quality interactions
• Avoiding more difficult exercises
• Avoiding tests
• Avoiding activities proposed by the teacher
• Avoiding social interactions (chat and forum)
• What kind of interactions led students to succeed?
– Good quality interactions
• Just the opposite of low quality interactions
• How can we detect these interactions?
– Classify students according to the way they interact with educational resources
• How can we help teachers react to them?
– Warn teachers/tutors when these behaviors are detected
– Identify why students’ interaction quality is low
– Recommend appropriate actions to overcome the pedagogical issues found
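The warning step above could be sketched as a rule-based check over per-student counters. This is a hypothetical illustration, not the paper's actual detector; the field names and zero thresholds are assumptions:

```python
# Hypothetical rule-based alert: flag the "low quality interaction"
# pattern (avoiding difficult exercises, mock tests, teacher activities,
# and social interactions). Field names are illustrative assumptions.
def should_warn(student):
    return (
        student["difficult_exercises_done"] == 0
        and student["mock_tests_done"] == 0
        and student["activities_done"] == 0
        and student["forum_posts"] + student["chat_messages"] == 0
    )

at_risk = should_warn({
    "difficult_exercises_done": 0, "mock_tests_done": 0,
    "activities_done": 0, "forum_posts": 0, "chat_messages": 0,
})
print(at_risk)  # → True
```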
23. Conclusion
• Teachers need assistance to help students
– Considerable amount of data
– Reports and graphs do not provide quick help
• Students need help
– Quick
– Personalized to their needs
• The learning environment
– Prepared to identify pedagogical issues
– Prepared to take reactive or preventive measures
24. References
1. BAYER, Jaroslav; BUDZOVSKA, Hana; GERYK, Jan; OBSIVAC, Tomás;
POPELINSKY, Lubomir. Predicting drop-out from social behaviour of students.
Educational Data Mining Conference, 2012.
2. HUANG, Shaobo; FANG, Ning. A Work in Progress: Early Prediction of
Students' Academic Performance in an Introductory Engineering Course
Through Different Mathematical Modeling Techniques. Frontiers in
Education, 2012.
3. KANELLOPOULOS, Dimitris; KOTSIANTIS, Sotiris. Towards an ontology-based
system for intelligent prediction of student dropouts in distance education.
International Journal of Management in Education. Vol 2, No. 2, 2008.
4. MORAN, José Manoel. O que aprendi sobre avaliação em cursos semi-
presenciais. In: SILVA, Marco; SANTOS, Edméa (Orgs). Avaliação da
Aprendizagem em Educação Online. São Paulo: Loyola, 2006. Available at:
http://www.metodista.br/atualiza/conteudo/cursos/docentes/semana-de-
capacitacao-docente-fev-20111/arquivos/avaliacao-cursos-semi.pdf.
5. PAIVA, Ranilson O. A.; BITTENCOURT, Ig Ibert; SILVA, Alan Pedro, 2013. Uma
Ferramenta para a Recomendação Pedagógica Baseada em Mineração de
Dados Educacionais. Federal University of Alagoas- Programa de Pós-
Graduação em Modelagem Computacional do Conhecimento. Maceió. Brasil.
6. ROMERO, Cristóbal; VENTURA, Sebastián; PECHENIZKIY, Mykola; BAKER, Ryan.
Handbook of Educational Data Mining. Florida: CRC Press, 2011.
25. Authors
Daniel Borges
Federal University of Alagoas
Jário Santos
Federal University of Alagoas
Dr. Ig Ibert Bittencourt
Federal University of Alagoas
Dr. Alan Pedro da Silva
Federal University of Alagoas
Student Panel
Displays Modules and Units, their deadlines as well as functions like:
Create Mock Tests
Evaluations
Study
Reports
Sub-tree representation (The course learning path)
Present the pedagogical content tree
Talk about the different educational resources (chat, forum, exercises, etc.)
Histogram comparing failures and approvals (absolute numbers).
The number of disapproved students signaled that something was wrong (only 18.5% of students were approved).
A course on bioelectricity offered in Coursera
The problem is relevant and it happens in other environments.
Number of students for each event
This graph shows that, of the 12,725 students registered, only 313 successfully concluded the course (about 2.46%).
Based on these findings, we raised the following research questions:
Research Questions:
What kinds of interactions lead students to fail?
How can we detect these interactions and react to them?
How can we make these actions transparent to teachers/tutors?
Expanding Moran’s definition, we used other educational resources to detect how students interact with the learning environment.
Exams were held in person, in order to confirm students’ identity, among other reasons.
Exams were not considered in the analysis because they were not online tasks.
Boxplot: checking for outliers in accesses and points in tests data
We collected data regarding students’ interaction with the educational resources.
We checked the data for extreme values.
We treated the data, removing extreme values (outliers, identified as test data), as well as null and missing values (which would interfere with the classification tree algorithm).
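The cleaning step described above can be sketched with the conventional boxplot rule (values outside 1.5 × IQR treated as outliers); the 1.5 multiplier is the standard boxplot choice and is an assumption here, since the slides do not state the exact cutoff:

```python
import statistics

def clean(values):
    """Drop missing values, then drop boxplot-style outliers
    (outside 1.5 * IQR around the quartiles)."""
    values = [v for v in values if v is not None]   # drop null/missing
    q1, _, q3 = statistics.quantiles(values, n=4)   # quartiles
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if lo <= v <= hi]     # drop outliers

print(clean([1, 2, 3, 4, 5, 100, None]))  # → [1, 2, 3, 4, 5]
```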
Number of students per 500 interaction points bin
We grouped students according to their performance/final score.
We observed three distinct groups, each one representing a practice:
In practice 1, the results are considerably below expectations, resulting in quitting (students stopped accessing the environment before the end of the course) and failing.
In practice 2, we find moderate results that were nonetheless insufficient for approval. Some of these students, if appropriately helped, might have achieved approval.
In practice 3 we find the students who successfully finished the course and those with the best performance. It is possible to analyse their interactions in order to guide other students towards successful interactions.
Low Interaction: from 0 to 1499 points (120 instances)
Medium Interaction: from 1500 to 3499 points (63 instances)
High Interaction: from 3500 to 5000 points (14 instances)
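The "students per 500 interaction points bin" grouping mentioned above can be sketched as follows; the score values used are illustrative, not the course data:

```python
from collections import Counter

def bin_counts(scores, width=500):
    """Count students per interaction-point bin; each bin is labeled
    by its lower bound (0, 500, 1000, ...)."""
    return Counter((s // width) * width for s in scores)

# Illustrative scores only:
print(bin_counts([120, 480, 900, 1600, 3700]))
# → Counter({0: 2, 500: 1, 1500: 1, 3500: 1})
```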
According to [MORAN, 2006], some practices can be explained by the way students interact with the educational resources (hypothesis: the better students interact with the educational resources, the better their performance).
The Interaction Performance was used as a class
Low Interaction: from 0 to 1499 points (120 instances)
- Only 2 students did nothing
- Watched videos, did a few exercises, took no mock tests, and did no activities
Medium Interaction: from 1500 to 3499 points (63 instances)
- Watched videos, did some exercises, a few mock tests and a few activities
High Interaction: from 3500 to 5000 points (14 instances)
- Watched videos, did most/all exercises, did some mock tests and did most/all activities