Systematizing Game Learning Analytics
for Improving Serious Games Lifecycle
Baltasar Fernandez-Manjon
balta@fdi.ucm.es , @BaltaFM
e-UCM Research Group , www.e-ucm.es
Telefónica-Complutense Chair on Digital Education and Serious Games
(Cátedra Telefónica-Complutense)
Serious Games
• Any use of digital games with purposes other than
entertainment (Michael & Chen, 2006)
• Applied successfully in many domains (medicine,
business) with different purposes (knowledge,
awareness, behavioural change)
• But adoption of Serious Games in mainstream education is still low
• More evidence supporting SG efficacy is needed
Clark Abt, 1970
https://ig.ft.com/uber-game/
https://getbadnews.com/
https://www.re-mission2.org/
https://www.play2prevent.org/
http://burymemylove.arte.tv/
Do serious games actually work?
- Very few SG have a formal evaluation (e.g., pre-post)
- Usually tested with a very limited number of users
- Formal evaluation could be as expensive as creating the
game (or even more expensive)
- Evaluation is not yet considered as a strong requirement
- Difficult to deploy games in the classroom
- Teachers have very little info about what is happening when a
game is being used
- What has the student learned from playing the game?
Serious Games for bullying & cyberbullying
A. Calvo-Morata, C. Alonso-Fernández, M. Freire-Morán, I. Martínez-Ortiz and B. Fernández-Manjón, Serious games to prevent and
detect bullying and cyberbullying: a systematic serious games and literature review in Computers & Education, 8 July 2020
Serious Games for bullying & cyberbullying
• Only 14/32 games with user validation
• Validation: average < 300 users per game
• Very few collect user interactions
• Various target groups
• They address the problem in a variety of ways
• 20 address bullying, 7 cyberbullying and 5 both; they work on
empathy, raise awareness, show prevention strategies and
safe internet use, report harassment, identify the problem,
change behavior, and develop emotional/social skills
• No open games
• Lack of free access to the games' resources and code
A. Calvo-Morata, C. Alonso-Fernández, M. Freire-Morán, I. Martínez-Ortiz and B. Fernández-Manjón, Serious games to prevent and
detect bullying and cyberbullying: a systematic serious games and literature review in Computers & Education, 8 July 2020
Formal validation of serious games
The most common methodology is pre-post questionnaires in experiments:
Is there a significant difference between pre-questionnaire and post-questionnaire results?
Pre and post questionnaires should have been previously validated.
Learning analytics & Game Analytics
• Learning analytics: Improving education based
on analysis of actual data
• Data driven
• From only theory-driven to evidence-based
• Game Analytics: Application of analytics to
game development and research (Telemetry)
• Game metrics
• Interpretable measures of data related to games
• Player behavior
• Mainly used for commercial purposes
• monetization, churn, user funnels
Game Learning Analytics
Breaking the game black-box model to obtain information while students play.
Manuel Freire, Ángel Serrano-Laguna, Borja Manero, Iván Martínez-Ortiz, Pablo Moreno-Ger, Baltasar Fernández-Manjón (2016): Game
Learning Analytics: Learning Analytics for Serious Games. In Learning, Design, and Technology (pp. 1–29). Cham: Springer International
Publishing. http://doi.org/10.1007/978-3-319-17727-4_21-1.
Game Learning Analytics (GLA)
• GLA is learning analytics applied to serious games
• Collect, analyze and visualize data from learners’ interactions with SGs
Uses of Game Learning Analytics in Serious
Games Lifecycle
• Game testing – game analytics
• Player focus – user experience
• Average playing time, completion rate
• Game deployment and student evaluation
• Real-time information for supporting the teacher
• Knowing what is happening when the game is deployed
in the class
• “Stealth” student evaluation (Valerie Shute)
• Formal Game evaluation – game effectiveness
• From pre-post questionnaires to evaluation based on
game learning analytics?
From Formal Game Validation to Game Learning Analytics
Diagram: from formal validation (PRE/POST questionnaires with an experimental group) to GLA-based evaluation (real-time and off-line analysis), covering user control, session control, game efficacy, user acceptance and design validation.
Minimum Game Requirements for GLA
• Most games are black boxes
• No access to what is going on during gameplay
• We need access to game “guts”
• User interactions
• Changes of the game state or game variables
• Or the game must communicate with the outside world
• Using some logging framework (see the sketch after this list)
• What is the meaning of that data?
• Ethics: adequate experimental design and setting
• Are users informed?
• Anonymization of data could be required
• Fair data exploitation for all stakeholders?
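As a rough illustration of the "communicate with the outside world" option above, the sketch below wraps game-state changes in a tiny logging hook. The GameLogger class and event names are invented for illustration; they are not part of any specific engine or of the e-UCM trackers.

# Hedged sketch: exposing game "guts" through a minimal logging hook.
import json, time

class GameLogger:
    """Collects interaction events so they can later be sent to an analytics backend."""
    def __init__(self):
        self.events = []

    def log(self, event_type, target, **details):
        self.events.append({
            "timestamp": time.time(),
            "event": event_type,      # e.g. "variable_changed", "level_completed"
            "target": target,         # which game object or variable is involved
            "details": details,
        })

logger = GameLogger()

# Calls placed at the points where the game state actually changes:
logger.log("level_started", "level_1")
logger.log("variable_changed", "health", old=100, new=80)
logger.log("level_completed", "level_1", score=0.9)

print(json.dumps(logger.events, indent=2))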
Game Learning Analytics (GLA) or Informagic?
• Informagic
• False expectations of gaining full insight into the game's
educational experience based only on very shallow game
interaction data
• Set more realistic expectations about learning analytics with
serious games
• Requirements
• Outcomes
• Uses
• Cost/Complexity
Perez-Colado, I. J., Alonso-Fernández, C., Freire-Moran, M., Martinez-Ortiz, I., & Fernández-Manjón, B. (2018). Game
Learning Analytics is not informagic! In IEEE Global Engineering Education Conference (EDUCON).
Game Learning Analytics
1. What data is to be collected from the game and how it relates to learning goals
2. Which specific statements (e.g. in xAPI format) are to be tracked from the game containing that information
3. How the statements collected are to be analyzed and what information is to be reported and/or visualized
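One way to make these three layers concrete is a small design-time mapping from learning goals to the statements to track and the analyses to produce. The goals, events and analyses below are invented purely for illustration.

# Hedged sketch: a design-time mapping from learning goals to the xAPI-style
# statements to track and the analyses/visualizations to produce.
GLA_MODEL = [
    {
        "learning_goal": "Recognize a cardiac-arrest situation",
        "statements":    ["selected <alternative: chest-pain-dialogue>"],
        "analysis":      "per-option selection counts; success rate per player",
    },
    {
        "learning_goal": "Complete the emergency protocol",
        "statements":    ["initialized/progressed/completed <completable: level>"],
        "analysis":      "completion rate and progress-over-time chart",
    },
]

for row in GLA_MODEL:
    print(f"{row['learning_goal']}\n  track:  {row['statements']}\n  report: {row['analysis']}\n")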
Results: RQ1 GLA purposes
➔ Main focus: assess learning &
predict performance
➔ Games are indeed useful for
purposes beyond entertainment
➔ Interest now in analyzing
interaction data to measure
impact on players and its relation
to players’ in-game behaviors
Results: RQ2 data science techniques
➔ Linear models and cluster
techniques commonly applied
➔ Classical techniques
➔ More powerful techniques (e.g.
neural networks) not broadly
applied yet
➔ Need for xAI
➔ Explainable AI
➔ Human-understandable decisions
Results: RQ3 main stakeholders
➔ Purposes that cover interests of
many stakeholders
➔ Much research has been done in this area
➔ Students/Learners indirect
recipients of all results
Results: RQ4 conclusions and results
Results on assessment & student profiling:
➔ GLA data can accurately predict games’ impact
➔ Performance is related to players’ characteristics
Results on SG design:
➔ GLA data can validate SG design
➔ Assessment can & should be integrated in SG design
➔ Importance of SG characteristics
➔ Identified challenges when designing SG
➔ Proposed frameworks to simplify design
Results: Additional information
Serious games used:
➔ Main focus: teaching
➔ Main domains: maths and science-related topics
Participants in the validation studies:
➔ Small sample sizes used (<100)
➔ Primary & secondary education
Interaction data:
➔ Completion times, actions & scores commonly tracked
➔ Format not reported
Requirements to Systematize GLA in SG
• Applying GLA to serious games is complex, error-prone, and fragile
• Any small glitch can cause the whole process to fail
• GLA is still a complex process that is not affordable for most small
game producers or game research teams
• Systematizing GLA in SG requires better models, standards and tools
• Game Learning Analytics Models
• Standard formats for collecting GLA data
• Tools that simplify the GLA implementation
• Authoring
• Tracking
• Analysis
• Orchestration / Management
GLA framework
Experience API for Serious Games: xAPI-SG Profile
Experience API (xAPI) is a new de facto standard that
enables the capture of data about human performance and
its context; it is now becoming an IEEE standard.
The e-UCM Research Group, in collaboration with ADL,
created the Experience API for Serious Games Profile (xAPI-SG),
an xAPI profile for the specific domain of Serious Games.
The xAPI-SG Profile defines a set of verbs, activity types and
extensions that allow tracking all in-game interactions
as xAPI traces (e.g., level started or completed).
https://xapi.e-ucm.es/vocab/seriousgames
Ángel Serrano-Laguna, Iván Martínez-Ortiz, Jason Haag, Damon Regan, Andy Johnson, Baltasar Fernández-Manjón (2017):
Applying standards to systematize learning analytics in serious games. Computer Standards & Interfaces 50 (2017) 116–123,
xAPI-SG Profile
The xAPI-SG Profile is the result of implementing an interaction
model for Serious Games in xAPI.
The types of interactions that can be performed in a Serious Game, and that are
included in the profile, can be grouped based on the type of interaction and the
game object the interaction is performed over.
The following slides present some of these common interactions and the game
objects related to them, with example xAPI-SG statements.
● completables
● accessibles
● alternatives
● GameObjects
xAPI-SG: Completables
A completable is something a player can start, progress and complete in a
game, maybe several times.
● Verbs: initialized, progressed, completed
● Types: game, session, level, quest, stage, combat, storynode, race, completable
Example: John Smith progressed Level 1 with progress 0.5 (sketched as a full statement below)
https://xapi.e-ucm.es/vocab/seriousgames
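As an illustration of the trace format, the example above would correspond to a statement roughly like the sketch below. The IRIs shown are assumptions and should be checked against the xAPI-SG vocabulary (https://xapi.e-ucm.es/vocab/seriousgames).

# Hedged sketch of an xAPI-SG-style statement: "John Smith progressed Level 1 (0.5)".
# IRIs and the account homePage are illustrative, not authoritative.
import json

statement = {
    "actor": {"name": "John Smith",
              "account": {"homePage": "https://example.org", "name": "john-smith"}},
    "verb": {"id": "https://w3id.org/xapi/seriousgames/verbs/progressed",      # assumed IRI
             "display": {"en-US": "progressed"}},
    "object": {"id": "https://example.org/games/my-game/level-1",
               "definition": {
                   "type": "https://w3id.org/xapi/seriousgames/activity-types/level",  # assumed IRI
                   "name": {"en-US": "Level 1"}}},
    "result": {"extensions": {
        "https://w3id.org/xapi/seriousgames/extensions/progress": 0.5}},       # assumed IRI
    "timestamp": "2021-01-01T10:15:30Z",
}

print(json.dumps(statement, indent=2))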
xAPI Game trackers as open code
Open-source xAPI trackers are available for Java, Unity and C#:
https://github.com/e-ucm
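The trackers above offer a high-level interface (their exact APIs are in the linked repositories; I do not reproduce them here). Underneath, sending a trace boils down to an HTTP POST to the LRS' standard xAPI /statements endpoint, roughly as in this sketch; the endpoint URL and credentials are placeholders.

# Hedged sketch: what a minimal tracker does under the hood — POSTing an
# xAPI statement to a Learning Record Store.
import requests

LRS_URL = "https://lrs.example.org/xapi/statements"   # placeholder LRS endpoint
AUTH = ("lrs-user", "lrs-password")                   # placeholder basic-auth credentials

statement = {
    "actor": {"account": {"homePage": "https://example.org", "name": "student-042"}},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed", "display": {"en-US": "completed"}},
    "object": {"id": "https://example.org/games/first-aid/level-1"},
    "result": {"score": {"scaled": 0.9}, "success": True},
}

response = requests.post(
    LRS_URL,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},    # version header required by the xAPI spec
)
response.raise_for_status()
print("Stored statement id(s):", response.json())     # the LRS returns the statement id(s)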
SIMVA: SG Simple Validator
• The Simva tool aims to simplify all aspects of the validation process
• Before the experiments:
• Managing users & surveys
• Providing anonymous identifiers to users
• During the experiments:
• Pre-questionnaire – Game analytics – Post-questionnaire
• Collecting and storing questionnaires (surveys) and traces data (xAPI-SG)
• Relating different data from users (GLA, questionnaires)
• After the experiments:
• Simplifying downloading and analysis of all data collected
Ivan Perez-Colado, Antonio Calvo-Morata, Cristina Alonso-Fernández, Manuel Freire, Iván Martínez-Ortiz, Baltasar Fernández-Manjón (2019): Simva:
Simplifying the scientific validation of serious games. 19th IEEE International Conference on Advanced Learning Technologies (ICALT), 15-18 July 2019,
Maceió-AL, Brazil.
SIMVA orchestrates all the processes: unified cloud storage, analytics and control, survey management, data science tools, user & group management, and gameplay management.
T-mon: Monitoring xAPI-SG traces in Python
T-mon (Traces Monitor), the xAPI-SG Processor, is a set of Jupyter Notebooks that process traces following the Experience API Profile for Serious Games (xAPI-SG) and provide a default set of analyses and visualizations.
https://github.com/e-ucm/xapi-sg-processor
Default analysis and visualizations
• Serious game completion: initialized and completed traces with object-type serious-game
• Serious game progress: initialized, progressed and completed traces with object-type serious-game, result.progress and timestamp
• Choices in alternatives: selected traces with object-type alternative, result.response and result.success
• Completable progress: progressed traces in any completable object type, with result.progress
Default analysis and visualizations
• Interactions: interacted traces with any object type; bar chart per item, one bar per player
• Completable results (scores): completed traces of any completable with result.score
• Completable results (max and min times): difference in timestamp between initialized and completed traces of each completable
• Interactions (heatmap): interacted traces grouped by item (object) and player
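As a rough idea of the kind of processing behind these defaults (not necessarily how T-mon implements them), a few lines of pandas over xAPI-SG-style traces suffice to compute, for example, serious game completion and per-item interaction counts. The toy traces below carry only the fields used.

# Hedged sketch: computing two of the default indicators from xAPI-SG-like traces.
import pandas as pd

traces = [
    {"actor": "p1", "verb": "initialized", "object_type": "serious-game", "object": "game"},
    {"actor": "p1", "verb": "completed",   "object_type": "serious-game", "object": "game"},
    {"actor": "p2", "verb": "initialized", "object_type": "serious-game", "object": "game"},
    {"actor": "p1", "verb": "interacted",  "object_type": "game-object",  "object": "door"},
    {"actor": "p2", "verb": "interacted",  "object_type": "game-object",  "object": "door"},
    {"actor": "p2", "verb": "interacted",  "object_type": "game-object",  "object": "poster"},
]
df = pd.DataFrame(traces)

# Serious game completion: share of players who both initialized and completed the game.
game = df[df["object_type"] == "serious-game"]
started = set(game[game["verb"] == "initialized"]["actor"])
completed = set(game[game["verb"] == "completed"]["actor"])
print("Completion rate:", len(completed) / len(started))

# Interactions: interacted traces counted per item and player (basis for the bar chart / heatmap).
interactions = (df[df["verb"] == "interacted"]
                .groupby(["object", "actor"]).size()
                .unstack(fill_value=0))
print(interactions)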
GLA Methodology
1. Game validation phase:
○ Validate the serious game against actual results (post-test)
○ Collect all the game analytics data to improve the game
2. Prediction phase: use GLA interaction data to predict knowledge after playing.
○ Create prediction models taking the interaction data as input (a minimal sketch follows below)
3. Game deployment phase:
○ Students play and are automatically assessed based on their interactions (used
as input for the prediction models)
○ Pre-post tests are no longer required
Cristina Alonso-Fernández, Ana Rus Cano, Antonio Calvo-Morata, Manuel Freire, Iván Martínez-Ortiz, Baltasar Fernández-Manjón (2019): Lessons
learned applying learning analytics to assess serious games. Computers in Human Behavior, Volume 99, October 2019, Pages 301-309
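Step 2 can be prototyped with any standard regressor over per-player interaction features. In the sketch below the feature names and values are invented, and linear regression is just one possible model choice.

# Hedged sketch: predicting post-test knowledge from GLA interaction features.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# One row per player: [total_play_time_min, correct_choices, levels_completed]  (illustrative)
X = np.array([[12, 8, 3], [25, 12, 5], [8, 5, 2], [18, 10, 4], [30, 14, 5], [15, 9, 3]])
y = np.array([7.0, 11.5, 5.0, 9.5, 13.0, 8.5])   # post-test scores (toy values)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("MAE on held-out players:", mean_absolute_error(y_test, model.predict(X_test)))

# Deployment phase: new players are assessed from their traces alone, no post-test needed.
new_player_features = np.array([[20, 11, 4]])
print("Predicted post-test score:", model.predict(new_player_features)[0])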
The game: First Aid Game
Game to teach first aid techniques
to 12- to 16-year-old players
Three initial situations:
● chest pain
● unconsciousness
● choking
Game previously validated with
pre-post and control group:
Video-game instruction in basic life support
maneuvers. Marchiori EJ, Ferrer G, Fernandez-Manjon
B, Povar Marco J, Suberviola González JF, Gimenez
Valverde A. (2012)
Pre-post questionnaires + GLA data
N = 227 students from a high school in Madrid (Spain)
Each student completed:
● pre-test: 15 questions assessing previous knowledge
about first aid techniques
● gameplay: playing the First Aid Game
● post-test: 15 questions assessing knowledge about
first aid techniques after playing
Both pre-post test results and GLA interaction data from the game
(following the xAPI-SG Profile) were collected.
Cristina Alonso-Fernández, Iván Martínez-Ortiz, Rafael Caballero, Manuel Freire, Baltasar Fernández-Manjón (2020): Predicting students’ knowledge after
playing a serious game based on learning analytics data: A case study. Journal of Computer Assisted Learning, vol. 36, no. 3, pp. 350-358, June 2020.
New uses of games based on GLA
- Avoiding the pre-test: games for evaluation
- Avoiding the post-test: games for teaching and measuring learning,
with or without a pre-test
Cyberbullying: Conectado game
Serious Game → prevent Bullying and
Cyberbullying
● Increase awareness and empathy
● Youngsters (12 to 17 years old)
● Used at school as a tool for teachers
Conectado:
● Point & Click game
● Player in the role of victim
● Choices and minigames that you cannot win
● Free and open-source video game
Validation results (chart values: 1300, 12): significant increase in cyberbullying awareness (Wilcoxon paired test, p < 0.001); mean score rose from 5.72 (pre) to 6.38 (post).
Antonio Calvo-Morata, Dan-Cristian Rotaru, Cristina Alonso-Fernández, Manuel Freire, Iván Martínez-Ortiz, Baltasar Fernández-Manjón (2018):
Validation of a Cyberbullying Serious Game Using Game Analytics. IEEE Transactions on Learning Technologies (early access)
Our xAPI tool-based approach for GLA
uAdventure authoring
tool
● Narrative and GPS games
● Easy to use by non-experts
● GLA out-of-the-box
● Extensions for custom GLA
e-UCM xAPI tracker
● Compatible with xAPI for
serious games
● High-level interface to create
traces
● Local and online modes
Simva management tool
● Integrated with uA
● Automatically collects traces
in xAPI format (simple LRS)
● Manages experiments and
users
● Integrates GLA and pre-post
questionnaires
T-mon analysis tool
● Analyzes xAPI traces
● Default visualizations
● xAPI for SGs as LAM
● Can connect with Simva
This set of tools provides an
interoperable sandbox system for
supporting the xAPI GLA lifecycle:
● Simplifies SG creation
● Reduces the technical knowledge needed to
set up and collect analytics
● Simplifies experiment management
● Provides default analytics so that:
○ Students can see how they did
○ Teachers can monitor and
evaluate
○ Developers and teachers can
review and improve their games
Interoperable ecosystem for complete GLA lifecycle
Conclusions
• Game Learning Analytics has great potential for improving SGs
• Evidence-based serious games
• Games as assessments (better “Stealth” student evaluation)
• Games as powerful research environments
• Still complex to implement GLA in SG
• Increases the (already high) cost of the games
• Requires expertise not always present in game developers, SMEs or research
groups
• Real-time GLA is still complex and fragile (e.g., deployment in schools)
• New standard specifications (e.g., xAPI) and open software tools could
greatly simplify GLA implementation and adoption
• Ethics should drive the GLA process
Thanks!
Contact: balta@fdi.ucm.es @baltafm
This work has been partially funded by Regional Government of Madrid (eMadrid S2018/TCS4307, co-funded by the European
Structural Funds FSE and FEDER), by the Ministry of Education (TIN2017-89238-R, PID2020-119620RB-I00), by MIT-La Caixa (MISTI
program, LCF/PR/MIT19/5184001) and by Telefonica-Complutense Chair on Digital Education and Serious Games.
Our publications: https://www.e-ucm.es/publications/all/
Our open code: https://github.com/e-ucm/
