The Application of Game-Like Learning Design to Real-World Settings: a Holistic Approach to Scaling Project-Oriented Problem-Based Learning Faculty-Wide

An overview of the paper presented at ISAGA 2016.

Slide 1: The Application of Game-Like Learning Design to Real-World Settings: a Holistic Approach to Scaling Project-Oriented Problem-Based Learning Faculty-Wide
Benita Rowe, Stefanie Gruttauer, Arno Bitzer, Siegfried Stumpf, Gabriele Koeppe
Slide 2: Background Information
• The TH Köln is the largest institution of its kind in Germany, with more than 24,000 students and 1,200 academic staff.
• The Faculty of Computer Science and Engineering receives an average of 390 new enrolments per semester to bachelor's degree programs such as mechanical engineering, electrical engineering and engineering management (December 2015).
• The students come from a diverse range of economic and cultural backgrounds and have an increasingly broad range of abilities upon commencement of their studies, specifically in terms of academic preparedness for tertiary education and mastery of the German language.
Slide 3: Four Key Challenges for Higher Education in Germany
1. Percentage of achievement variance among newly enrolled students
2. Rapid increase in new enrolments (abolition of compulsory military service and shortened secondary school duration)
3. Rising dropout rate
4. Decline in academic literacy levels - the mean literacy score in Germany is highest among adults who have attained tertiary education; however, "55% of these adults perform at or below Level 1, compared with 39% on average among participating [OECD] countries" (OECD, 2014, para. 17)
Slide 4: Applying Game-Like Learning to Course Design
• In recognition of these four key challenges for higher education in Germany, the TH Köln obtained funding from the German Ministry of Education (BMBF) in May 2011 to develop a cross-disciplinary curriculum with the aim of scaling project-oriented problem-based learning faculty-wide.
• The course, centred on game-like learning delivered through a series of 'side quests and errands', or scaffolded tasks, is supported by a responsive assessment feedback loop which guides the students in learning how they can influence their assessment results with their own behaviours while competing with other groups in 'multiplayer mode'. The students, or 'players', are required to complete the side quests and errands to progress through the course, or 'real-time strategy game', which is designed to simulate a professional setting.
Slide 5: Course Structure
Students complete a series of scaffolded side quests and errands in multiplayer mode to progress through the course, or 'real-time strategy game', which is designed to simulate a professional setting. Peer tutoring by cross-age tutors assists students in practicing self-regulation. Academic teaching staff facilitate, evaluate and monitor the progress of the students throughout the course.
[Course structure diagram, flattened in the original; its elements:]
• Level 1 (students): five-week lecture series & side errand
• Student tutors: selected and trained
• Formative online questionnaire (to measure team progress) & written feedback for each group
• Summative Milestone II presentation (meeting with profs & tutors) & written feedback for each group
• Kick-off event (Level 2)
• Level 2: Milestones 1, 2 and 3 (students & tutors); Supervisions 1, 2 and 3 (staff and senior student tutors discuss the progress of the students); team deliverables submitted
• Jury presentation, film premiere & awards ceremony
Slide 6: Responsive Assessment Feedback Loop
• In order to move away from testing as an ‘end-of-course’ measure towards testing for learning, assessment has been reframed as a responsive assessment feedback loop. Designed to mimic the functionality of feedback loops in game design, responsive assessment mechanisms have been integrated into the course with the aim of creating a learning environment anchored in multiple sources of summative and formative feedback.
• Hattie and Timperley’s (2007) model of feedback is used to guide the professors, the academic research assistants, the senior student tutors and the students throughout this process.
• The model (Hattie, 2011, p. 3) comprises four levels of feedback, three of which we have integrated into our feedback loop (task or product, processes, self-regulation), and three feedback questions designed to encourage the development of metacognitive knowledge and self-regulation: Where am I going? How am I going? Where to next?
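The level-to-question mapping above is essentially a small lookup structure. As an illustrative sketch only (the names `FEEDBACK_LOOP` and `guiding_question` are ours, not from the paper), the three integrated feedback levels and their guiding questions could be represented as:

```python
# Illustrative sketch: the three feedback levels from Hattie and Timperley's
# model that the course integrates, each mapped to its guiding question and
# feedback focus as described on the slide.
FEEDBACK_LOOP = {
    "task_or_product": {
        "question": "Where am I going?",
        "focus": "information-focused feedback (e.g. correct or incorrect)",
    },
    "processes": {
        "question": "How am I going?",
        "focus": "processes used to create the product or complete a task",
    },
    "self_regulation": {
        "question": "Where to next?",
        "focus": "the students' monitoring of their learning processes",
    },
}

def guiding_question(level: str) -> str:
    """Return the guiding feedback question attached to a feedback level."""
    return FEEDBACK_LOOP[level]["question"]

print(guiding_question("self_regulation"))  # Where to next?
```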
Slide 7: Responsive Assessment Feedback Loop – Phase 1

Task or Product – Where am I going?
  Feedback: information-focused feedback (e.g. correct or incorrect)
  Assessment: online formative quests (mobile-optimised) and a summative multiple-choice 'side errand'

Processes – How am I going?
  Feedback: targets the processes used to create the product or complete a task
  Assessment: formative structured in-lecture questioning using game-based learning platforms and the students' mobile devices; instructional scaffolding in the form of print and multimedia resources

Self-Regulation – Where to next?
  Feedback: the students' monitoring of their learning processes
  Assessment: social networking technologies are leveraged for 'rolling' formative feedback and discussion
Slide 8: Responsive Assessment Feedback Loop – Phase 2

Task or Product – Where am I going?
  Feedback: information-focused feedback (e.g. correct or incorrect)
  Assessment: scaffolded summative quests and errands (Milestone II presentation, milestone documents, film, article) completed under the guidance of senior student tutors while competing with other groups in multiplayer mode

Processes – How am I going?
  Feedback: targets the processes used to create the product or complete a task
  Assessment: a formative online questionnaire is used to measure both the team climate and progress mid-way through Phase 2

Self-Regulation – Where to next?
  Feedback: the students' monitoring of their learning processes
  Assessment: social networking technologies are leveraged for 'rolling' formative feedback and discussion throughout Phase 2
Slide 9: Course Evaluation

Categories of metacognitive knowledge, by pilot:

Strategy variables: "I learnt how to strategically approach the given tasks."
  Pilot 1: N = 285, Mean = 2.40, SD = 1.011, SEM = .060
  Pilot 2: N = 394, Mean = 2.35, SD = 1.031, SEM = .052
  T = .614, p (2-sided) = .540

Task variables: "I learnt how to independently estimate how long I need to complete the given tasks and what I need to do to complete the given tasks."
  Pilot 1: N = 272, Mean = 2.25, SD = .978, SEM = .059
  Pilot 2: N = 395, Mean = 2.35, SD = 1.048, SEM = .053
  T = -1.300, p (2-sided) = .670

Person variables: "I learnt how to better estimate how I learn and what to do if I don't understand something."
  Pilot 1: N = 262, Mean = 2.78, SD = 1.063, SEM = .066
  Pilot 2: N = 390, Mean = 2.62, SD = 1.154, SEM = .058
  T = 1.769, p (2-sided) = .077 a

Likert scale: 1 = Strongly Agree, 2 = Agree, 3 = Neutral, 4 = Disagree, 5 = Strongly Disagree. Pilot 1 = Winter Semester 2013/14 - Summer Semester 2014; Pilot 2 = Winter Semester 2014/15 - Summer Semester 2015. a = absence of homoscedasticity, df adapted, p > 0.05.
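The table above reports independent-samples t-tests between the two pilot cohorts, with degrees of freedom adapted where homoscedasticity is absent. As a minimal sketch (not the authors' analysis code), such a comparison can be reproduced from the published summary statistics alone using Welch's unequal-variance t-test; the function name is ours, and the result differs slightly from the slide's T = .614 because the published means and SDs are rounded:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's unequal-variance t statistic and adapted degrees of freedom
    (Welch-Satterthwaite), computed from summary statistics."""
    v1, v2 = sd1 ** 2 / n1, sd2 ** 2 / n2  # per-group variance of the mean
    t = (mean1 - mean2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Strategy variables, Pilot 1 vs Pilot 2 (values from the table above)
t, df = welch_t(2.40, 1.011, 285, 2.35, 1.031, 394)
print(round(t, 2), round(df))  # close to the slide's T = .614; the gap comes
                               # from rounding in the published means
```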
Slide 10: Results and Practical Implications
• Despite the substantial changes made in the course design and assessment procedures from Pilot 1 to Pilot 2, the mean response to each item changed by less than 0.2. Interestingly, the task variable was rated non-significantly more negatively, changing from 2.25 in Pilot 1 to 2.35 in Pilot 2.
• This is surprising, as the strategies that feature strongly in the course design for Pilot 2 (peer tutoring from cross-age tutors, formative assessment, feedback, challenge and practice at the right level, and valuing error and creating trust) have all been found to have a high impact on learning (Hattie, 2009, pp. 297-298).
• Previous studies have shown that project-oriented learning at tertiary level can have a negative effect on learning if implemented before students have obtained sufficient surface knowledge (Hattie, 2015, p. 85). This could explain why the students' perception of their achievement did not improve markedly between Pilot 1 (non-specific Phase 2 quest topic) and Pilot 2 (subject-specific Phase 2 quest topic). Although challenge and feedback were included in the course design, the level of difficulty (subject-specific quest topic) was not appropriate for the level of surface knowledge possessed by the first-semester students.
