In this talk we will analyze the effects of gamification in the social network of a large online course on ‘digital skills for teachers.’ Educational social networking websites and learning systems can gather information about contributions of participants and about the underlying social network. We will present an experimental gamification layer with three game elements (points, badges, and leaderboard) that was delivered to students. Social network analysis (SNA) and principal component analysis (PCA) can then be used to analyze the differences between groups using information about contributions to the website, and position and influence in the social network of each participant. Initial results suggest that variables and participants group differently, and that gamification may influence the structure of the social network of participants in the course. The first component (F1) can be a good descriptor of students’ work and position in the network that can be used to build predictive models of learning success. The models suggest that the probability of passing the course increases more rapidly in the experimental (gamified) groups for students that participate.
ECTEL 2018 Presentation: Investigating the relationship between online activity, learning strategies and grades to make learning analytics-supported learning designs
Blackboard’s data science team conducts large-scale analysis of the relationship between the use of our academic technologies and student impact, in order to inform product design, disseminate effective practices, and advance the base of empirical research in educational technologies.
In this presentation, John Whitmer, Director of Analytics & Research, will discuss findings from 2016. Some findings challenge our conventional knowledge, while others confirm what we believed to be true.
Archived presentation made to JISC Learning Analytics workgroup on Feb 22, 2017
Educational Data Mining in Program Evaluation: Lessons Learned (Kerry Rice)
AET 2016. Researchers present findings from a series of data mining studies, primarily examining data mining as part of an innovative triangulated approach to program evaluation. Findings suggest that it is possible to apply EDM techniques in online and blended learning classrooms to identify key variables important to the success of learners. Lessons learned will be shared, as well as areas for improving data collection in learning management systems for meaningful analysis and visualization.
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
9:15 am – ANN Auditorium
Keynote: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education.
Using Social Network Analysis to Assess Organizational Development Initiatives (Stephanie Richter)
Presented at 2016 POD Network conference #POD16
Many Faculty Development centers engage in far-reaching organizational development initiatives within their institutions. These initiatives are incredibly valuable but difficult to assess using traditional methods. Social network analysis (SNA) is a powerful visualization and statistical technique that has multiple applications in researching and assessing organizational development. In this session, learn how SNA was used at one institution to investigate the formation of community regarding online course quality standards as well as to analyze organizational structure for strategic planning. While this session focuses on organizational uses, examples will also be shared of applications for teaching and research.
Visualizing Community through Social Network Analysis (Stephanie Richter)
We introduce a lot of new initiatives to our campuses, such as innovative pedagogies, emerging technologies, and updated policies and procedures. Making these changes last requires building a community around the innovation, but it is difficult to know who is involved and how the innovation is adopted across campus. When the Northern Illinois University Office of Program Development and Support formed in 2014, we also introduced social network analysis to study how the online teaching community evolved over time. In this presentation, we will offer an overview of social network analysis, describe how we have implemented it at NIU, and share some of our initial findings.
This presentation was originally presented at the 2015 SLATE Conference.
Data-Driven Education 2020: Using Big Educational Data to Improve Teaching an... (Peter Brusilovsky)
Modern educational settings, from regular classrooms to MOOCs, produce a rapidly increasing volume of data that captures the individual learning progress of millions of students at different levels of granularity. The presence of this data opens a unique opportunity to re-engineer traditional education and to develop a range of efficient data-driven approaches to support teaching and learning. In my talk, I will present several ways to use big educational data explored in our lab. The focus will be on open social learning modeling and on identifying individual differences through sequential pattern mining, but several other approaches will be mentioned. Open social learning modeling and sequential pattern mining provide two considerably different examples of using educational data. One offers an immediate use of class interaction history to develop more engaging content access, while the other shows how big data can be used to uncover important individual differences that can be used to optimize the process for individual learners.
A reflection on where we are with learning analytics as a new multi-discipline research area. Reflections from the Learning Analytics Conference 2013 with respect to Assessment.
Speakers:
David Lewis, senior analytics consultant, Jisc
Martin Lynch, learning systems manager, University of South Wales
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, and an opportunity to discuss issues and possibilities that the use of learning analytics may create.
Keynote Address, Expanding Horizons 2012, Macquarie University
http://staff.mq.edu.au/teaching/workshops_programs/expanding_horizons
"Learning Analytics": unprecedented data sets and live data streams about learners, with computational power to help make sense of it all, and new breeds of staff who can talk predictive models, pedagogy and ethics. This means rather different things to different people: unprecedented opportunity to study, benchmark and improve educational practice, at scales from countries and institutions, to departments, individual teachers and learners. "Benchmarking" may trigger dystopic visions of dumbed down proxies for 'real teaching and learning', but an emu response is no good. For educational institutions, our calling is to raise the quality of debate, shape external and internal policy, and engage with the companies and open communities developing the future infrastructure. How we deploy these new tools rests critically on assessment regimes, what can be logged and measured with integrity, and what we think it means to deliver education that equips citizens for a complex, uncertain world.
ICTs are transforming Cuban higher education towards the adoption of blended learning and distance learning. This dissertation investigates the effectiveness of using social software to support collaborative learning in a Cuban university. Five studies were conducted within three phases that covered diagnosis, integration, and validation of the social software used to support collaborative learning. A didactic model was created to integrate social software within Cuban teaching and learning in higher education. Social network analysis and content analysis were used to evaluate the effectiveness of social software in supporting students' learning, through their collaborative learning relationships and through their posts in wiki pages and online discussions. Statistical analysis was used to evaluate students' self-efficacy as a measure of their achievement in social software-supported collaborative learning. The findings confirmed social software's suitability to support collaborative learning, as it increased the effectiveness of collaborative learning compared to face-to-face collaboration. Specific findings were revealed for the use of wikis and online discussions within teaching and learning, which are extendable to other social software tools. A didactic model to integrate social software in Cuban teaching and learning, as well as a framework to analyse students' interactions, were used for the first time and validated to extend their use among Cuban university stakeholders.
The Impact of Digital Literacy Practices on Learning Outcomes in Higher Educ... (J'ette Novakovich)
This paper reports the findings of a Stage I meta-analysis exploring the effectiveness of online digital literacy practices performed through social media tools in higher education classrooms, as measured by learning outcomes. An extensive literature search culled more than 500 potential articles and resulted in a sample of 51 representative quasi-experimental studies, consisting of 4,630 total participants. Fifty-one effect sizes were extracted and yielded a moderately positive, statistically significant weighted average effect size of g+ = 0.315, k = 51, p < .01. This overall effect size suggests that integrating online digital literacy practices into the higher education classroom benefits students on measures of academic achievement and offers significant learning support.
In addition, several moderator variables were tested to determine what factors and literacy practices impact learning outcomes; namely, field of study (STEM, ARTS), conceptualization of tool (social learning theory, delivery tool), peer interaction (yes, no), modality (blogs, collaborative communication text-based technologies, i.e. forums and wikis; podcasts, and virtual worlds), practice (consuming, prosuming behaviors), and the learning outcome measured (course project, knowledge-based exam); moderator effect sizes were statistically significant for the following variables: conceptualization of the tool, practice, and learning outcomes.
The aim of our study is to extract profiles of student activities performed during the training sessions of a course on logic networks, and to relate those activities to the students’ performance on intermediate verification tests. In this course, undergraduate students learn and practice the concepts of logic networks with the Deeds simulator.
Deeds ("Digital Electronics Education and Design Suite") is a set of educational tools for digital electronics. It is used in Electronic Engineering courses at DITEN, UNIGE.
By applying learning analytics methods to the data captured from activity logs and questionnaires, we aim to understand the learning behavior of students.
This project was presented at Learning Analytics Data Sharing – LADS14 Workshop at EC-TEL.
An institutional perspective on analytics that focuses on a particular tool, developed using an agile methodology, to visualise learner behaviours in MOOCs via Sankey diagrams.
EMMA Summer School - Rebecca Ferguson - Learning design and learning analytic... (EUmoocs)
This hands-on workshop will work with learning design tools and with massive open online courses (MOOCs) on the FutureLearn platform to explore how learning design can be used to influence the choice and design of learning analytics. This workshop will be of interest to people who are involved in the design or presentation of online courses, and to those who want to find out more about learning design, learning analytics or MOOCs. Participants will find it helpful to have registered for FutureLearn and explored the platform for a short time in advance of the workshop.
This presentation was given during the EMMA Summer School, that took place in Ischia (Italy) on 4-11 July 2015.
More info on the website: http://project.europeanmoocs.eu/project/get-involved/summer-school/
Follow our MOOCs: http://platform.europeanmoocs.eu/MOOCs
Design and deliver your MOOC with EMMA: http://project.europeanmoocs.eu/project/get-involved/become-an-emma-mooc-provider/
Learning analytics and how to use them in educational or serious games to improve those games, covering game traces and evidence-based education.
Talk at the École Normale Supérieure, Lyon, France
Learning Analytics for online and on-campus education: experience and research (Tinne De Laet)
This presentation was given by Tinne De Laet, KU Leuven, as a keynote during the event http://www.educationandlearning.nl/agenda/2017-10-13-cel-innovation-room-10-learning-and-academic-analytics organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides nine recommendations regarding learning analytics.
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... (John Andrews)
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
As Europe's leading economic powerhouse and the fourth-largest #economy globally, Germany stands at the forefront of innovation and industrial might. Renowned for its precision engineering and high-tech sectors, Germany's economic structure is heavily supported by a robust service industry, accounting for approximately 68% of its GDP. This economic clout and strategic geopolitical stance position Germany as a focal point in the global cyber threat landscape.
In the face of escalating global tensions, particularly those emanating from geopolitical disputes with nations like #Russia and #China, #Germany has witnessed a significant uptick in targeted cyber operations. Our analysis indicates a marked increase in #cyberattack sophistication aimed at critical infrastructure and key industrial sectors. These attacks range from ransomware campaigns to #AdvancedPersistentThreats (#APTs), threatening national security and business integrity.
🔑 Key findings include:
🔍 Increased frequency and complexity of cyber threats.
🔍 Escalation of state-sponsored and criminally motivated cyber operations.
🔍 Active dark web exchanges of malicious tools and tactics.
Our comprehensive report delves into these challenges, using a blend of open-source and proprietary data collection techniques. By monitoring activity on critical networks and analyzing attack patterns, our team provides a detailed overview of the threats facing German entities.
This report aims to equip stakeholders across public and private sectors with the knowledge to enhance their defensive strategies, reduce exposure to cyber risks, and reinforce Germany's resilience against cyber threats.
Social network analysis of large online gamified courses

1. Social network analysis of large online gamified courses
Luis de Marcos Ortega, Assoc. Prof.
Computer Science Department
Universidad de Alcalá
luis.demarcos@uah.es
https://goo.gl/sBjvw1
Time4Science. July 2018. Varaždin
Faculty of Organization and Informatics. Varaždin. Univ. Zagreb.
4. Background and Related Work
Gamification
• Gamification is the use of game elements in non-game contexts to promote participation and motivate action (Deterding et al., 2011; Werbach and Hunter, 2012).
• Framing activities as a game through game elements, like points, badges, and leaderboards, holds as much psychological power as the full game mechanics (Lieberoth, 2014).
• Gamification increases users’ performance for simple repetitive tasks (Mekler et al., 2013), but findings of its motivational effects are contradictory (Hanus and Fox, 2015; Mekler et al., 2017).
• Effectiveness is also in question, pointing to the necessity to align gamification with the goal of the activity and to address the psychological needs of users at design time (van Roy and Zaman, 2017).
5. Background and Related Work
Gamification in education
• Educators are trying to harness the potential of gamification to design motivating learning experiences.
• Education is the most common context in which gamification is implemented and reported (Hamari et al., 2014; Seaborn and Fels, 2015; Martí-Parreño et al., 2016).
• Existing research presents mixed results regarding the impact of gamified elements such as badges, points, and leaderboards on learning and affective outcomes.
• For instance, a positive effect on practical assignments but a negative effect on conceptual learning (Domínguez et al., 2013).
6. Background and Related Work
Social Networking in Education
• Important limitation: most of the existing research on the utility and effectiveness of social media in higher education is limited to self-reported data (e.g., surveys, questionnaires) and content analyses (Tess, 2013).
• Educational social networking sites provide data for analysis, like the contributions of participants and the connections between them → social network analysis (SNA).
• Application of SNA to e-learning environments is at a very early stage, although the number of studies is increasing (Cela et al., 2015).
7. Background and Related Work
Gamification + Social Networking (in Education)
• The number of studies that bring together gamification and social networking is limited.
• Social gamification framework to assist teachers in creating motivational learning experiences fitted to learners’ needs (Simões et al., 2013)
• No analysis of results
• In a series of studies, de-Marcos et al. compared gamification and social networking, concluding that both yielded similar results regarding learning performance in an undergraduate course (de-Marcos et al., 2014; de-Marcos et al., 2016a; de-Marcos et al., 2016b).
8. Objectives
1. Scrutinize the structure of the underlying social network in large online courses
2. Analyze the effect of gamification on the structure of the social network in large-scale online courses
3. Study the impact of position & influence (in the social network), contributions (in the SNS) & game elements on learning success in large-scale online courses
• Predictive models of student success
9. Setting
• Undergraduate online course “Digital Skills for Teachers” (free offering in Portuguese)
• MOOC approach
• Autonomous and independent learning with a strong emphasis on the social and collaborative dimension
• 4 ECTS. 6 weeks. Syllabus:
1. Searching and sharing online resources
2. Using digital tools in the classroom
3. Promoting collaborative learning using digital tools
• 3 editions: 363, 427 and 591 students
11. Methods
Instruments
• SNS functionality:
• Social networking (all groups)
• news, learning guide, dashboard, blogs, bookmarks and internal tweeting
• Gamification layer (experimental group)
• thirteen achievements, points, and a leaderboard
• Technical implementation: Moodle + Elgg
12. Methods
Measures
• Social Network Analysis (SNA)
• Network measures can be used to analyze social interactions and the structure of the network, as well as changes over time
• Hypothesis: gamification influences contributions and the social network
• Measures of each individual participant represent her position and influence in the network
• Effects of gamification at the level of each individual
• Analyze learning performance in relation to contribution, position and influence in the network
13. Methods
Measures
• Individual network metrics
• Degree, Closeness centrality, Eccentricity, Betweenness
centrality, Clustering coefficient, Eigenvector centrality
• Link analysis: PageRank, authority, and hub
• Overall network metrics
• Avg. degree, Graph density and Avg. path length
• Participation metrics
• blogs, tweets, likes, messages (to other participants),
comments (to any publication), followers, following,
logins and total interactions
• number of achievements and points earned
(experimental group only)
• Learning performance: Passed or not passed
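The individual metrics listed above can all be derived from the interaction graph. A minimal sketch using networkx on a toy graph (an assumption for illustration — the talk does not name its SNA tooling, and the real course network is not shown):

```python
# Sketch of the individual network metrics listed on this slide, computed
# with networkx on a toy graph standing in for the course network.
import networkx as nx

G = nx.karate_club_graph()  # stand-in for the course interaction network

metrics = {
    "degree": dict(G.degree()),
    "closeness": nx.closeness_centrality(G),
    "eccentricity": nx.eccentricity(G),
    "betweenness": nx.betweenness_centrality(G),
    "clustering": nx.clustering(G),
    "eigenvector": nx.eigenvector_centrality(G),
    "pagerank": nx.pagerank(G),
}
hubs, authorities = nx.hits(G)  # link analysis: hub and authority scores
metrics["hub"], metrics["authority"] = hubs, authorities

# One vector of metrics per participant (node)
for node in sorted(G.nodes())[:3]:
    print(node, {k: round(v[node], 3) for k, v in metrics.items()})
```

Each participant then has one row of metric values, which can be joined with the participation metrics for the later analyses.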
16. Results
Structure of the social network
                             Experimental  Control
Nodes (including teachers)   600           437
Edges                        3200          715
Avg. degree                  5.33          1.64
Graph density                .009          .004
Avg. path length             2.29          2.59
Avg. clustering coefficient  .49           .36
Nodes with degree ≠ 0
(at least 1 connection)      313 (53%)     167 (39%)
Students that passed         31 (5.25%)    14 (3.28%)
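The overall network measures of the kind reported in this table can be computed the same way; a sketch on a random placeholder graph (not the actual course data), again assuming networkx:

```python
# Overall network measures on a placeholder graph, not the course data.
import networkx as nx

G = nx.erdos_renyi_graph(100, 0.05, seed=1)  # toy network

n_nodes = G.number_of_nodes()
avg_degree = sum(d for _, d in G.degree()) / n_nodes
density = nx.density(G)
connected = sum(1 for _, d in G.degree() if d > 0)  # nodes with degree > 0

# Average path length is only defined within a connected component,
# so compute it on the largest one
giant = G.subgraph(max(nx.connected_components(G), key=len))
avg_path = nx.average_shortest_path_length(giant)

print(f"nodes={n_nodes}, avg. degree={avg_degree:.2f}, density={density:.3f}")
print(f"avg. path length={avg_path:.2f}, "
      f"connected nodes={connected} ({100 * connected / n_nodes:.0f}%)")
```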
17. Results
Predictive models of learning success
• F1 (first component) is a good descriptor of
global student activity:
• Experimental group
• Includes 18/20 variables (measures)
• Describes 68% of variability
• Control group
• Includes 16/18 variables (measures)
• Describes 55% of variability
• Use F1 as predictor of learning success
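The extraction of F1 can be sketched as a standard PCA over the standardized measures. The data below is synthetic, with a single latent "activity" factor standing in for the real 18–20 participation and network variables:

```python
# Sketch of deriving F1 (the first principal component) from a table of
# per-student measures; synthetic data, not the real course variables.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
activity = rng.normal(size=(300, 1))      # latent student activity
loadings = rng.normal(size=(1, 18))       # 18 toy measures
X = activity @ loadings + 0.5 * rng.normal(size=(300, 18))

pca = PCA()
scores = pca.fit_transform(StandardScaler().fit_transform(X))

f1 = scores[:, 0]                             # each student's F1 score
explained = pca.explained_variance_ratio_[0]  # variability described by F1
print(f"F1 describes {explained:.0%} of the variability")
```

The per-student `f1` scores are then what feeds the predictive models on the following slides.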
19. Results
Predictive models of learning success
•Problems (MOOCs…):
• Many of the students that register carry
out little or no work at all
• A dichotomous variable (passed/not
passed) measures learning success
• Some students that do a lot of work (activity)
do not earn the certificate
• Some students with little activity pass
20. Results
Predictive models of learning success
• For a better characterization, introduce a new
measure called success probability (the probability
of earning the certificate)
• Order the dataset by F1 (or any other predictor variable)
• sample the values above and below a given point, taking
n values that create a sliding window of samples:
P(i) = (1/n) · Σ A(j), summing over the n points j of the window centered at i
• where A(j) is success at point j (A(j)=1) or not (A(j)=0)
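The sliding-window estimate above can be implemented in a few lines; a minimal sketch on toy data (the function name and the synthetic pass rates are assumptions for illustration):

```python
# Minimal implementation of the sliding-window success probability:
# order students by F1, then take the proportion of passes (A(j) = 1)
# among the n students nearest to each point.
import numpy as np

def success_probability(f1, passed, n=21):
    """Rolling pass rate over windows of n students ordered by f1."""
    order = np.argsort(f1)
    a = np.asarray(passed, dtype=float)[order]   # A(j): 1 = passed, 0 = not
    prob = np.convolve(a, np.ones(n) / n, mode="valid")
    half = n // 2                                # n/2 points lost at each end
    return np.asarray(f1)[order][half:len(a) - half], prob

# Toy data in which the pass rate rises with F1
rng = np.random.default_rng(1)
f1 = rng.uniform(0, 10, size=400)
passed = rng.random(400) < f1 / 20
x, p = success_probability(f1, passed, n=21)
```

As the speaker notes mention, n data points are lost per plot (n/2 at each end), and the estimate is stable for window sizes such as 11, 21 and 31.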
22. Results
Predictive models of learning success
[Scatter plot: estimated success probability, ProbCertificate (n=21), vs. F1 for the control group; linear fit y = 0.0282x + 0.0346, R² = 0.9107]
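The linear fit and R² reported for the plot can be reproduced with numpy; `x` and `y` below are synthetic stand-ins for F1 and the estimated success probability, not the control-group data:

```python
# Reproducing a linear fit and R² of the kind shown in the plot,
# on synthetic stand-in data.
import numpy as np

x = np.linspace(0, 10, 50)
y = 0.028 * x + 0.035 + np.random.default_rng(2).normal(0, 0.01, size=50)

a, b = np.polyfit(x, y, 1)                 # slope and intercept
y_hat = a * x + b
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"y = {a:.4f}x + {b:.4f}, R² = {r2:.4f}")
```

The slope is what the next slide compares between conditions: a steeper slope means the probability of success grows faster with F1.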
23. Results
Predictive models of learning success
• Probability of getting the certificate increases more
rapidly in the experimental group for students that
engage, i.e., that score higher in the variables that
make up F1
• Gamification seems to mediate in learning success
through increased participation
• Game elements > Participation > Success
24. Discussion
• Gamification fosters connections: many nodes have
a substantial number of connections (hubs)
• a redistribution of the flow of communication between
students that widens and changes the patterns of
participation (Aviv et al., 2003)
• Personal activity and structural centrality in the
educational social network are correlated (Klein et
al., 2015).
• Variability explained by the first component (F1) is
higher for the experimental group
• gamification helps explain students’ work,
providing a better statistical description
25. Discussion
• Previous studies point to correlations between
social networking and learning success (Cho et al.,
2007; Thoms, 2011), and between gamification-driven
social networking and learning success (de-Marcos
et al., 2017; de-Marcos et al., 2016b), which our
study confirms in a MOOC setting.
• F1, besides being a good representation of
students’ work and position in the network, is
also a good estimate of the probability of success.
26. Conclusions
• Gamification influences the final structure of the
social network as measured by network metrics
and individual connections of participants
• Network metrics and measures of participation are
also a good representation of student work that
facilitate building predictive models of the
probability of success for students
• Predictive models show that students in the
experimental condition (gamification) have a higher
probability of passing the course (earning a
certificate) if they participate.
27. Limitations
• A causal relationship between social networking /
gamification and learning performance is not
established
• Quasi-experimental design
• Generalization
• Three cohorts of students and three social networks are
studied
• Particular educational setting
28. Future Work
• Confirm the effect of gamification on
participation
• e.g. clustering / reduction of dimensionality
• Alternative data analysis (e.g. data mining)
• Mobile learning and augmented reality
• Motivations & player types
• Other constructs/scales (e.g. competition vs
collaboration)
29. References
AVIV, R., ERLICH, Z., RAVID, G. & GEVA, A. 2003. Network Analysis of Knowledge Construction in Asynchronous Learning
Networks. Journal of Asynchronous Learning Networks, 7, 1-23.
CELA, K. L., SICILIA, M. Á. & SÁNCHEZ, S. 2015. Social Network Analysis in E-Learning Environments: A Preliminary
Systematic Review. Educational Psychology Review, 27, 219-246.
CHO, H., GAY, G., DAVIDSON, B. & INGRAFFEA, A. 2007. Social networks, communication styles, and learning
performance in a CSCL community. Computers & Education, 49, 309-329.
DE-MARCOS, L., DOMÍNGUEZ, A., SAENZ-DE-NAVARRETE, J. & PAGÉS, C. 2014. An empirical study comparing
gamification and social networking on e-learning. Computers & Education, 75, 82-91
DE-MARCOS, L., GARCIA-LOPEZ, E. & GARCIA-CABOT, A. 2016. On the Effectiveness of Game-like and Social Approaches
in Learning: Comparing Educational Gaming, Gamification & Social Networking. Computers & Education, 95, 99-113.
DE-MARCOS, L., GARCÍA-LÓPEZ, E., GARCÍA-CABOT, A., MEDINA-MERODIO, J.-A., DOMÍNGUEZ, A., MARTÍNEZ-HERRÁIZ,
J.-J. & DIEZ-FOLLEDO, T. 2016. Social network analysis of a gamified e-learning course: Small-world phenomenon and
network metrics as predictors of academic performance. Computers in Human Behavior, 60, 312-321.
DE-MARCOS, L., GARCIA-CABOT, A. & GARCIA-LOPEZ, E. 2017. Towards the Social Gamification of e-Learning: a Practical
Experiment. International Journal of Engineering Education, 33, 66-73.
DETERDING, S., DIXON, D., KHALED, R. & NACKE, L. 2011. From game design elements to gamefulness: defining
"gamification". Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media
Environments. Tampere, Finland: ACM.
DOMÍNGUEZ, A., SAENZ-DE-NAVARRETE, J., DE-MARCOS, L., FERNÁNDEZ-SANZ, L., PAGÉS, C. & MARTÍNEZ-HERRÁIZ, J.-J.
2013. Gamifying learning experiences: Practical implications and outcomes. Computers & Education, 63, 380-392.
HAMARI, J., KOIVISTO, J. & SARSA, H. 2014. Does Gamification Work? – A Literature Review of Empirical Studies on
gamification. 47th Hawaii International Conference on System Sciences. Hawaii, USA.
30. References
HANUS, M. D. & FOX, J. 2015. Assessing the effects of gamification in the classroom: A longitudinal study on intrinsic motivation,
social comparison, satisfaction, effort, and academic performance. Computers & Education, 80, 152-161.
KLEIN, A., AHLF, H. & SHARMA, V. 2015. Social activity and structural centrality in online social networks. Telematics and
Informatics, 32, 321-332.
LIEBEROTH, A. 2014. Shallow Gamification: Testing Psychological Effects of Framing an Activity as a Game. Games and Culture.
MARTÍ-PARREÑO, J., MÉNDEZ-IBÁÑEZ, E. & ALONSO-ARROYO, A. 2016. The use of gamification in education: a bibliometric and
text mining analysis. Journal of Computer Assisted Learning, 32, 663-676.
MEKLER, E. D., BRÜHLMANN, F., OPWIS, K. & TUCH, A. N. 2013. Do points, levels and leaderboards harm intrinsic motivation?:
an empirical analysis of common gamification elements. Proceedings of the First International Conference on Gameful Design,
Research, and Applications. Toronto, Ontario, Canada: ACM.
MEKLER, E. D., BRÜHLMANN, F., TUCH, A. N. & OPWIS, K. 2017. Towards understanding the effects of individual gamification
elements on intrinsic motivation and performance. Computers in Human Behavior, 71, 525-534.
SEABORN, K. & FELS, D. I. 2015. Gamification in theory and action: A survey. International Journal of Human-Computer Studies,
74, 14-31.
SIMÕES, J., REDONDO, R. D. & VILAS, A. F. 2013. A social gamification framework for a K-6 learning platform. Computers in
Human Behavior, 29, 345–353.
TESS, P. A. 2013. The role of social media in higher education classes (real and virtual) – A literature review. Computers in Human
Behavior, 29, A60-A68.
THOMS, B. 2011. A Dynamic Social Feedback System to Support Learning and Social Interaction in Higher Education. IEEE
Transactions on Learning Technologies, 4, 340-352.
VAN ROY, R. & ZAMAN, B. 2017. Why Gamification Fails in Education and How to Make it Successful: Introducing Nine
Gamification Heuristics Based on Self-Determination Theory. In: MA, M. & OIKONOMOU, A. (eds.) Serious Games and
Edutainment Applications, Volume II. Cham, Switzerland: Springer.
WERBACH, K. & HUNTER, D. 2012. For the win: How game thinking can revolutionize your business, Philadelphia, Wharton
Digital Press.
31. Social network analysis of large
online gamified courses
Luis de Marcos Ortega, Assoc. Prof.
Computer Science Department
Universidad de Alcalá
luis.demarcos@uah.es
https://goo.gl/sBjvw1
Time4Science. July 2018. Varaždin
Faculty of Organization and Informatics. Varaždin. Univ. Zagreb.
Editor's Notes
Repeated the experiment with a different experimental group:
-Reduction of dimensionality (factor loadings): variables group similarly
-Clustering (factor scores): participants group similarly
This is for the experimental group, but all groups are similar
-n data points are lost from the plots: n/2 points at the beginning of the plot and n/2 at the end
-n (the window size) is an arbitrary value; tests were made with different values of n, such as 11, 21 and 31, and the behavior of the probability estimate was about the same, showing stability in the estimate