Mobile apps to support and
assess foreign language learning
Anke Berns
Manuel Palomo-Duarte
Juan-Manuel Dodero
Alberto Sánchez Cejas
Juan Miguel Ruiz-Ladrón
Andrea Calderón Márquez
EUROCALL 2015 Padova, 26-29 August 2015
1
2
1. TEACHING BACKGROUND
2. PURPOSE OF THE STUDY
3. CHALLENGES
4. DESIGN & ARCHITECTURE
5. CASE STUDY
6. RESULTS & CONCLUSIONS
7. FUTURE WORK
CONTENT
1. TEACHING BACKGROUND
3
•beginner level (A1.1, CEFR)
•very large language courses
•few hours of language practice in class versus many hours of independent learning
•few opportunities to attend to individual learner needs
•need for more personalised feedback
•need for more language input & output
2. PURPOSE OF THE STUDY
4
•to explore the potential of smartphones to enhance out-of-class language learning
•to make independent learning processes more dynamic & easier to monitor
3. CHALLENGES
5
•to design an app that, rather than delivering learning content, allows students to develop their own learning materials
•to develop a learning environment which is built by learners & teachers in line with their needs
•to help teachers monitor & assess students’ learning process
6
We designed an APP, called Guess it! Language Trainer, that aims:
– to support students in their independent learning by addressing different language skills
– to get students actively involved in their learning process
– to help teachers monitor & assess students’ learning process
4. DESIGN & ARCHITECTURE
7
GUESS IT! LANGUAGE TRAINER.
Students’ interaction with the app.
8
• The APP implements a simple game based on guessing, assessing & creating word definitions/descriptions in the target language.
• Students first select the category, level & number of definitions they want to play.
9
10
• Learners must rate every definition
they play (peer-assessment).
• Learners can report a definition if
they consider it wrong, offensive or
difficult to guess.
• Learners can listen to the
pronunciation of each definition.
• Learners can copy words to their
notebook.
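The interaction rules above (every played definition must be rated, and can optionally be reported) can be sketched in code. The following Python model is our own illustration under assumed names; the class, methods and rating scale are not taken from the app's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Definition:
    word: str
    text: str
    author: str
    ratings: list = field(default_factory=list)   # peer ratings, e.g. 1-5
    reports: list = field(default_factory=list)   # reasons: wrong / offensive / too hard

    def rate(self, score: int) -> None:
        # Peer-assessment: learners must rate every definition they play.
        self.ratings.append(score)

    def report(self, reason: str) -> None:
        # Learners can flag a definition they consider wrong, offensive
        # or too difficult to guess.
        self.reports.append(reason)

    def average_rating(self) -> float:
        return sum(self.ratings) / len(self.ratings) if self.ratings else 0.0

d = Definition("der Hund", "Ein Tier, das bellt.", "student01")
d.rate(5)
d.rate(4)
print(d.average_rating())  # 4.5
```

Reported definitions would stay in the pool but carry their report reasons, which is the information the future-work slide proposes to feed back to authors.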
11
• Students receive constant feedback through different game statistics (last game, rankings on their overall game performance, etc.)
12
• Students are invited to create & enter their own definitions:
• for every 20 words they guess, they may enter 1 new definition.
• new definitions are then played and rated (or reported) by classmates.
Game-Dynamic
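The unlock rule above (1 new definition per 20 guessed words) is a simple integer division; the function name below is our own illustration, not the app's code:

```python
def new_definitions_to_enter(words_guessed: int) -> int:
    # Game rule from the slides: for every 20 words a student guesses,
    # they may enter 1 new definition of their own.
    return words_guessed // 20

print(new_definitions_to_enter(19))  # 0
print(new_definitions_to_enter(40))  # 2
```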
Teacher assessment
Definitions played per student & level
19
Success ratio per student & level
20
Ratings received per author
21
Ratings received per author
22
Definitions entered per student
23
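The teacher-assessment charts listed above (definitions played, success ratio per student & level, etc.) are straightforward aggregations over the server play logs. A minimal sketch, assuming a hypothetical log format of (student, level, guessed_correctly) tuples:

```python
from collections import defaultdict

# Hypothetical play-log entries: (student, level, guessed_correctly)
log = [
    ("anna", 1, True), ("anna", 1, False), ("anna", 2, True),
    ("luis", 1, True), ("luis", 1, True),
]

played = defaultdict(int)    # definitions played per (student, level)
correct = defaultdict(int)   # correct guesses per (student, level)

for student, level, ok in log:
    played[(student, level)] += 1
    if ok:
        correct[(student, level)] += 1

# Success ratio per student & level, as charted on the teacher portal
for key in sorted(played):
    print(key, f"{correct[key] / played[key]:.0%}")
```

The same grouping, keyed by definition author instead of player, would yield the "ratings received per author" and "definitions entered per student" charts.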
5. CASE STUDY: SETTINGS
24
• German as a second foreign language course (6 ECTS) at the University of Cadiz (Spain): 120 students (A1.1 CEFR) played the APP
The corpus of the system started with 282 definitions entered by the supervisor ...
… and ended up with 826 definitions!
More than 500 new definitions (over 4 per student)
Each game session provided students with new learning content
- which motivated students to keep playing the APP
25
100 students completed 4 tests (60 questions per test):
Pre-test before playing levels 1 to 4
Post-test 1 after one week playing levels 1 & 2
Post-test 2 after one week playing level 3
Post-test 3 after a week and a half playing levels 1 to 4
• Each post-test consisted of 80% terms to be guessed and 20% words to be defined by the students themselves
26
• Results in the tests (out of 10 points)
– Mean gain from pre-test to post-test 1 was 4.8
– Mean gain from pre-test to post-test 2 was 5.7
– Mean gain from pre-test to post-test 3 was 5.35
• Students played a total of 165,178 definitions
‒ Assuming just 1 minute for each (reading, answering, assessing, and optionally reporting or copying), students played more than 5 hours per week!
6. RESULTS & CONCLUSIONS
27
‐ the results obtained (server logs & pre- and post-tests) show that students played intensively & considerably improved their language knowledge
‐ the supervisor could easily access valuable information for learning analytics & assessment
‐ the web portal helped to identify early those students who had difficulties & needed more personalised feedback
‐ the interviews with the students underlined the importance of giving them enough guidance on how to make the most of the app for their learning
‐ they also showed the need to involve students directly in the development process
‐ students enjoyed the app more in the second year thanks to the extensive training sessions
28
29
• Extension of the system to allow users to improve reported definitions.
• Multi-player mode.
• Other platforms: iOS, Windows Phone.
• Integration into LMSs (e.g. Moodle), SNSs, etc.
7. FUTURE WORK
30
QUESTIONS/ SUGGESTIONS?
Thank you for your attention!
If you, as a teacher or designer, have any suggestions on how to improve and implement the APP, please contact us:
anke.berns@uca.es
manuel.palomo@uca.es
juanma.dodero@uca.es
31

Editor's Notes

  • #30 Patches planned for updates in future iterations. The teacher will be able to see students' mistakes in advance.