This document summarizes a study that analyzed the types of feedback provided in five commercial early reading games. The study developed a content analysis framework to evaluate how the games addressed feedback about where students are going and how they are doing. The analysis found that the games predominantly provided outcome feedback on performance; good practices included setting learning objectives and success criteria, while game-play mechanics and more elaborative feedback designs remained areas for further work. The study identified both best practices and open questions to guide future work on designing game feedback that effectively supports early reading skills.
1. This project has received funding from the European Union’s Horizon 2020 research and innovation programme
under grant agreement No 731724.
A CRITICAL EXAMINATION OF FEEDBACK IN EARLY READING GAMES
Laura Benton, Mina Vasalou, Kay Berkling, Wolmet Barendregt and Manolis Mavrikis
2. IREAD CONCEPT
• Reading game
• Reader app
• Interactive e-books
• Meta-search engine
• Personalised learning (infrastructure, domain language models, computational tools) and teacher tools
4. FEEDBACK – WHAT AND WHY
• Where am I going?
• How am I doing?
• Where to next?
Hattie, J. and Timperley, H. 2007. The power of feedback. Review of Educational Research, 77, 1: 81-112.
5. OUTCOME FEEDBACK
How am I doing? Information about correctness, error location and performance measures.
Johnson, C. I., Bailey, S. K. and Van Buskirk, W. L. 2017. Designing Effective Feedback Messages in Serious Games and Simulations: A Research Review. In P. Wouters and H. Van Oostendorp (eds.). Springer, 119-140.
6. ELABORATIVE FEEDBACK
How am I doing? Task/topic information, corrective strategies, why (in)correct, or hints/prompts.
Example: “Look, there is a split digraph; that means the ’i’ makes a long vowel sound.”
Moreno, R. 2004. Decreasing cognitive load for novice students: Effects of explanatory versus corrective feedback in discovery-based multimedia. Instructional Science, 32, 1: 99-113.
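The distinction between outcome and elaborative feedback can be sketched as two message types, a hypothetical illustration only (the class names and messages are our assumptions, not taken from the games analysed):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OutcomeFeedback:
    """Verification only: correctness, error location, performance measures."""
    correct: bool
    error_position: Optional[int] = None  # e.g. index of the wrong grapheme

@dataclass
class ElaborativeFeedback(OutcomeFeedback):
    """Adds task/topic information, corrective strategies, or hints."""
    explanation: str = ""

def render(fb: OutcomeFeedback) -> str:
    # Outcome feedback only verifies; elaborative feedback additionally explains.
    msg = "Correct!" if fb.correct else "Try again."
    if isinstance(fb, ElaborativeFeedback) and fb.explanation:
        msg = msg + " " + fb.explanation
    return msg
```

For instance, `render(ElaborativeFeedback(correct=False, explanation="the 'i' makes a long vowel sound"))` would append the split-digraph explanation to the verification message.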
7. RESEARCH GOAL
• Goal: critical analysis of the dimensions of feedback that early learning games for reading promote and exclude
• Method: five commercial reading games (comprising 35 mini-games) analysed
• Analytic tool: content analysis framework developed by consolidating and iteratively testing previous frameworks
9. CONTENT ANALYSIS FRAMEWORK
• Teaching concepts: Reading instruction; Game play mechanics
• Where am I going?: Learning Objective; Success Criteria
• How am I doing? (Outcome): Knowledge of Results; Knowledge of Correct Result; Try-Again; Error Flagging
• How am I doing? (Elaborative): Topic Specific; Response Specific; Informational; Hints, Prompts or Cues
• How am I doing? (Rewards): Score System; Experience Points; Item Granting; Resources; Achievement System; Feedback Messages
• How am I doing? (Punishments): Removal; Loss
Hattie, J. and Timperley, H. 2007. The power of feedback. Review of Educational Research, 77, 1: 81-112.
Johnson, C. I., Bailey, S. K. and Van Buskirk, W. L. 2017. Designing Effective Feedback Messages in Serious Games and Simulations: A Research Review. In P. Wouters and H. Van Oostendorp (eds.). Springer, 119-140.
Shute, V. J. 2008. Focus on formative feedback. Review of Educational Research, 78, 1: 153-189.
Wang, H. and Sun, C.-T. 2011. Game Reward Systems: Gaming Experiences and Social Meanings. In Proceedings of the Fifth International Conference of the Digital Games Research Association (DiGRA).
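The framework above can be read as a coding scheme: each mini-game is checked against the codes under each dimension. A minimal sketch of that process (the category and code names come from the framework slide; the mini-game record and function are hypothetical, not how the study was actually implemented):

```python
# Coding scheme: each framework dimension maps to its set of codes.
FRAMEWORK = {
    "Teaching concepts": ["Reading instruction", "Game play mechanics"],
    "Where am I going?": ["Learning Objective", "Success Criteria"],
    "How am I doing? (Outcome)": ["Knowledge of Results", "Knowledge of Correct Result",
                                  "Try-Again", "Error Flagging"],
    "How am I doing? (Elaborative)": ["Topic Specific", "Response Specific",
                                      "Informational", "Hints, Prompts or Cues"],
    "How am I doing? (Rewards)": ["Score System", "Experience Points", "Item Granting",
                                  "Resources", "Achievement System", "Feedback Messages"],
    "How am I doing? (Punishments)": ["Removal", "Loss"],
}

def code_mini_game(observed_codes):
    """Return, per framework dimension, which codes were observed in one mini-game."""
    observed = set(observed_codes)
    return {dim: sorted(observed & set(codes)) for dim, codes in FRAMEWORK.items()}

# Hypothetical coding of a single mini-game:
result = code_mini_game(["Knowledge of Results", "Try-Again", "Score System"])
```

A coding like this makes gaps visible at a glance, e.g. an empty "Where am I going?" entry would flag a mini-game with no explicit learning objective or success criteria.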
19. CONCLUSION
Prevalent practice: providing outcome feedback
• Good practice: learning goals; criteria setting; reading instruction
• Areas for research: game play mechanics; try again; elaborative feedback
• Contributions: methodological tool; strengths/weaknesses of existing reading games; open questions for future design work
20. FUTURE WORK
• What kind of strategies does trial and error promote in children?
• Design specification documents that generalise to our games
15/06/2018
21. http://iread-project.eu iRead Project @iread_project
Thank you for your attention!
Editor's Notes
Our paper focuses on the design of feedback within a number of commercially available games aimed at teaching reading skills to young children.
CHI
Background – we undertook this study as part of EU H2020 project called iRead which is developing personalised learning technologies to support children’s reading. As part of this work we are developing a number of innovations, which include a reading game.
As a first step in designing this game we decided to examine reading games that are currently in use in schools, and we were particularly interested in the inclusion of feedback within these games.
Feedback has been defined as information given by an agent (human or digital) to inform a learner about their performance and understanding
In games: about enabling connections between the gameplay and the initial instructional objectives to help inform and guide the learner about their next step
Why should we and designers care - Research has recognised that feedback plays a powerful role in raising achievement
Hattie – previous work looking at almost 200 studies considering influences on student achievement found feedback to have one of the highest effects
Hattie and Timperley state that effective feedback must answer 3 key questions. “Where am I going?” requires clear goals and success criteria to be defined.
“How am I doing?” involves identifying and communicating the learner’s progress towards these goals.
“Where to next?” provides guidance and scaffolds to enable the learner to know what to do in the future to make better progress.
Drilling down to the middle bubble - how am I doing? [understand performance]
Two types of feedback that have been defined in the literature (Johnson et al)
Also known as verification feedback…
This could be as simple as you’re right or a tick, or more abstract in this case where the answers are represented by cakes in the fridge.
[ to help you improve]
Elaborative feedback includes information about the specific task or topic, corrective strategies, why an answer is correct/incorrect or hints/prompts for the correct response
Game: the child’s task is to build a word they have heard. As the child chooses a letter (or letter group), the sound is spoken back to them and the target word repeated.
--
theoretical and serious game specific frameworks of feedback
five commercial early reading games – selected 35 mini-games which sampled a range of game types and learning content
Mini-game was our unit of analysis
Identified in interviews with 8 primary teachers from a range of schools.
All of the selected games were designed to teach early reading skills to children aged 5 to 7 years, or to teach older children who are still struggling with acquiring these early reading skills.
All were online, and two were also available as apps.
The games covered the following key reading skills: phonics, vocabulary, fluency and comprehension.
This original iteration of the framework came from existing feedback frameworks
(From the work of Hattie and Timperley) Learning Objective and Success Criteria - Are these made explicit?
(From the work of Johnson et al) Different types of outcome and elaborative feedback
In a further iteration we added some additional components to the framework
-Hattie and Timperley make the point that feedback is powerful only when it builds on prior instruction, so we added in Teaching Concepts – teaching both the reading concept and the game mechanics
- Rewards for successes are a core component of games, coming in many different forms and give information about the correctness of a response - therefore we extended our framework to also examine their inclusion within games using an existing classification defined by Wang and Sun
In the interests of time here I will focus on the first three – rewards are covered in the paper
Nessy game
scores (sometimes more implicit, e.g. via sound effects or animations), error flagging
Through our analysis we identified a number of good practices in terms of the feedback design
Most games posed a clear goal, introduced by referring to the learning objective contextualised in the game mechanics (e.g. ‘put all the sheep in the /s/ pen’ or, in the example here, ‘split the word into syllables’).
Criteria for success were often made clear to the child, for instance displayed on screen through a quantified target (e.g. a set number of stars that needed to be acquired, like in this Busy Things game).
Many games first introduced and taught the key reading concept addressed in the game.
For example, in Education City instruction was provided prior to all mini-games and followed the same multi-sensory approach recommended in the classroom. Split digraphs are introduced visually through animated text and colour coding, the instructions are also read aloud, and the reading of split digraphs is modelled through the provision of multiple example words.
There were also a number of gaps in the feedback design that we identified
The majority of the games reflected less effort in supporting learning of the game mechanics, with the exception of the Nessy example here
Typically in games the player develops an understanding of the game play through experiencing failures at various points in the game and then trying again.
In learning games it is difficult to separate failure due to the game mechanic or due to a gap in understanding the learning content.
Children can skip the tutorial
Young target age group need opportunities to become familiar with the broader game play even if the mechanics are relatively simple prior to focusing on the learning content.
Could link to whether children are practicing familiar content or learning new content – different cognitive loads
[To consider – should learning be part of the game play. Teachers don’t value this kind of learning]
An incorrect response was often communicated implicitly by asking a child to try again, indicating that their previous attempt was not correct.
Most of the games allowed a child to repeatedly make the same mistake, not including the correct answer to support the child in understanding their error, thereby encouraging a trial-and-error approach. For example, the TYMTR game (video of two children playing) also did not explicitly communicate the correct answer.
One exception, and an exemplar of good practice on both trial and error and elaborative feedback, was Nessy: upon an error it explained the correct answer immediately, giving the child a chance to apply this knowledge in the same context but with new content.
Elaborative feedback was more limited, particularly in helping children recover from errors. The games we analysed presented some topic-specific feedback, such as this Education City mini-game focused on reading comprehension, and only one game presented response-specific feedback (explaining why a response was correct).
Open questions – including how to provide learning support beyond the content domain to also teach the mechanics of the game activity,
and how to support deep learning through appropriate elaborative feedback that allows learners to understand and correct their errors
Future work
Trial and error leads to an impasse – working out the game mechanic overtakes the learning content
Design specification – feedback when right (highlight the topic, read aloud) versus feedback when wrong (one trial-and-error attempt, then a hint, then the correct answer).
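The design specification in the note above can be sketched as a small escalation policy, one retry, then a hint, then the correct answer. This is a hypothetical illustration: the attempt counts and message wording are assumptions, not taken from the paper:

```python
def feedback_for_attempt(attempt: int, is_correct: bool, hint: str, answer: str) -> str:
    """Escalating feedback: allow one retry, then give a hint, then reveal the answer."""
    if is_correct:
        return "Correct!"  # right answer; the note also suggests highlighting the topic and reading aloud
    if attempt == 1:
        return "Try again."                 # one trial-and-error attempt allowed
    if attempt == 2:
        return "Hint: " + hint              # second error: scaffold with a hint
    return "The answer is " + answer + "."  # then reveal, to discourage repeated failure
```

For example, on a second wrong attempt at building the word “kite” the child might hear a hint, while a third wrong attempt reveals the answer rather than allowing further trial and error.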
We hope that our work will shine a spotlight on the importance of well-designed feedback for early learning games.
We are now working on using these findings to inform the design of feedback in our game. To get the latest updates about the work of our project, see the project links on the final slide.
We also thank the EU H2020 programme for providing the funding for this work, and the teachers from Stormont, Holy Cross and Whitchurch primary schools.
Embedding in the iRead game, following examples of good practice, e.g. teaching instruction and success criteria, providing additional elaborative feedback, ensuring the logic discourages a trial-and-error approach and provides the correct answer, and ensuring that new content and new mechanics are generally not introduced simultaneously.