Paulette Robinson, Office of Personnel Management (OPM)
The policy courses for government leaders offered by OPM in its open enrollment catalog have been redesigned using a blended model that includes microlearning and gamification.
In this session, Dr. Robinson will report on the development process and the effectiveness of the new design, based on the data collected.
Link to video found in slide deck: https://youtu.be/7KmRFrrYk9Y
3. Pilot Policy Course Redesign Process
• Needs Assessment
• Instructional Design
• Blended Instructional Design
• Selection of a Gamification/Microlearning Platform
• Content Development Team
• Evaluation and Data Results from Courses
• Lessons Learned
• Recommendations Going Forward
8. Blended Instructional Design Layout
The Online Portion (4 Weeks)
• Pre-Assessment
• Consistent topical areas and segments with clear segment objectives and overall learning outcomes
• Micro-learning/gamification (available on smart phones, tablets, and computers)
• Online folder 1: diagrams, job aids, and short summaries
• Online folder 2: Annotated bibliography of additional resources
• Online folder 3: Simulation Process
• Post-Assessment
The Residential Portion
• Review of online materials
• Activity to initiate networking and simulation
• No more than 3 expert speakers
• Conduct the simulation activity
19. Pre- and Post Assessment
Criteria        Pre-Assessment   Post-Assessment
Average Score   15               18
High Score      21               Perfect score (2)
Low Score       8                15
30 questions in assessment; n = 26
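The slide reports only aggregates (average 15 pre, 18 post). A minimal sketch of how such a summary is computed from paired scores; the score lists below are illustrative examples, not the actual pilot data:

```python
def assessment_summary(pre, post):
    """Summarize paired pre-/post-assessment scores.

    The deck reports only aggregates (average 15 -> 18, n=26,
    30 questions); the inputs here are hypothetical.
    """
    gains = [b - a for a, b in zip(pre, post)]
    improved = [g for g in gains if g > 0]
    return {
        "avg_pre": sum(pre) / len(pre),
        "avg_post": sum(post) / len(post),
        "pct_improved": 100 * len(improved) / len(gains),
        "avg_gain_among_improved": sum(improved) / len(improved),
    }

# Illustrative scores only -- not the actual pilot data.
pre = [15, 12, 18, 8, 21]
post = [18, 17, 18, 15, 24]
summary = assessment_summary(pre, post)
```

The same roll-up (share of students who improved, and their average gain) is the form used later in the notes.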
28. Student Survey After the Game
Technology Used to Access FBPP Game
Students could choose multiple responses.
Answer Choice                   Responses   Percentage
Government Smart Phones         3           10%
Government Tablet               1           3.33%
Government Laptop or Desktop    26          83.33%
Personal Smart Phone            7           23.33%
Personal Tablet                 3           10%
Personal Laptop or Desktop      15          50%
Total Respondents: 30
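Because students could select multiple devices, each percentage is computed against the respondent count (30), not against the total number of selections, so the column can sum past 100%. A sketch of that calculation using the counts from the slide (recomputing reproduces most of the printed percentages):

```python
# Multi-select survey question: each choice's percentage is
# count / respondents, not count / total selections.
respondents = 30
counts = {
    "Government Smart Phones": 3,
    "Government Tablet": 1,
    "Government Laptop or Desktop": 26,
    "Personal Smart Phone": 7,
    "Personal Tablet": 3,
    "Personal Laptop or Desktop": 15,
}
percentages = {k: round(100 * n / respondents, 2) for k, n in counts.items()}
# The values can sum past 100% because of multiple selections.
```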
29. Student Survey After the Game
Self-Report Student Content Retention
Answer Choice                                                Responses   Percentage
I remember 100% of the material presented in the game        1           3.33%
I remember 75% of the material presented in the game         11          36.67%
I remember 50% of the material presented in the game         12          40%
I remember 25% of the material presented in the game         6           20%
I don’t remember any of the material presented in the game   0           0%
Total                                                        30
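The retention figures quoted in the interpretation notes (80% retained at least half, 40% at least three quarters) are cumulative shares of these counts; a small sketch of that roll-up:

```python
# Self-reported retention counts from the slide, keyed by the
# percentage of material the student said they remembered.
counts = {100: 1, 75: 11, 50: 12, 25: 6, 0: 0}
total = sum(counts.values())  # 30 respondents

def at_least(level):
    """Share of students reporting retention of at least `level` percent."""
    return 100 * sum(n for lvl, n in counts.items() if lvl >= level) / total
```

Here `at_least(50)` gives 80.0 and `at_least(75)` gives 40.0, matching the figures in the notes.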
30. Student Survey After the Game
Engagement of Game Content
Answer Choice                               Responses   Percentage
I found the game very engaging              8           26.67%
I found the game somewhat engaging          7           23.33%
I found the game OK                         12          40%
I found the game somewhat boring            1           3.33%
I found the game boring and uninteresting   2           6.67%
Total                                       30
31. Student Survey After the Game
Technical Issue (Comments)
Most Common Responses:
1. Issues getting into the game from government devices (3 responses)
2. Distorted display of the game on Internet Explorer
3. Performance of the GamEffective platform
4. EMDC-produced Jeopardy games
Student Proposed Changes (Comments)
1. Overall positive
2. A couple of students did not like the gamification format
3. A few comments suggested the game be shortened
32. FBPP End-of-Course Evaluation—Kirkpatrick Level 1
1-5 Likert Scale Rating
Satisfaction/Reaction (Level 1)   Pilot   2016
Recommend to colleagues           4.12    3.86
Worth my time                     4.28    3.86
Engagement                        4.36    4.12
Program Relevance                 3.68    3.71
Faculty                           4.56    4.43
Course Materials                  3.08    3.14
Activities                        4.28    3.57
Environment                       4.28    4.29
Overall Level 1                   4.04    3.88
2016: n=7 out of 30
Pilot: n=25 out of 31
33. FBPP End-of-Course Evaluation—Kirkpatrick Level 2
Likert Rating Scale 1-5
Criteria                                  Pilot   2016
Attitude: Commitment to public service    3.72    4.29
Confidence: Apply to work                 3.92    4.00
Commitment: Committed to apply to work    4.20    4.19
Overall Level 2                           3.95    4.19
2016: n=7 out of 30
Pilot: n=25 out of 31
34. FBPP End-of-Course Evaluation—Kirkpatrick Levels 3 & 4
Likert Rating Scale 1-5
Criteria                                          Pilot   2016
Training will improve work behaviors (Level 3)    3.36    4.00
Program will improve job performance (Level 4)    3.56    4.00
2016: n=7 out of 30
Pilot: n=25 out of 31
35. FBPP End-of-Course Evaluation
Likert Rating Scale 1-5
                                      Faculty                                        Simulation
Criteria                              Obannon   Hezir   Haun   Gibbons   Jensen   Session 1   Presido
Overall instructor enhanced
my learning                           4.68      4.68    4.08   4.52      4.42     4.67        4.52
Instructor's knowledge/background
enhanced my learning                  4.75      4.75    4.12   4.65      4.42     4.71        4.56
Clearly conveyed program
information                           4.56      4.68    4.08   4.44      4.43     4.75        4.52
2016: n=7 out of 30
Pilot: n=25 out of 31
Editor's Notes
Office of Personnel Management
Human Resource Services (HRS)
Center for Leadership Development (CLD)—There are a number of organizations under the Center for Leadership Development. The Federal Executive Institute in Charlottesville, VA is probably the most well known.
I am part of the Eastern Management Development Center (EMDC), which offers Open Enrollment Leadership Courses for federal government supervisors, managers, and executives. EMDC offers LEAD certifications at each of these levels.
I work as a Program Director for the Policy Courses that are offered through EMDC.
The FBPP course (the first course in this policy course design) is for managers and executives (GS-14-15 and SES).
Class size is 30 students.
The landscape of the modern learner is one of the drivers to design and pilot blended learning options.
Recommendations
Reduce time and cost of the policy courses.
Consider blended course options that engage the students online in the basic knowledge of the course and prepare them for practical activities that apply knowledge in the classroom.
Incorporate a policy framework into every policy course that includes creating, analysis, interpretation, implementation, reporting and evaluation in some form.
The historical context of policy, including its evolution and how to analyze it, needs to be part of each course.
Effective policy communication needs to be one of the learning outcomes.
Key leadership skills identified for the LEAD certificate are incorporated across the policy courses.
Ensure that faculty are selected for their expertise (conceptual and practical), in a timely fashion, and prepared to engage the students in the topic.
Create a course follow-up mechanism where students can reconnect and report how their learning in the course has helped them on the job.
Choose learning activities in the classroom portions of the courses that engage the students in practical experiences and facilitate student networking.
Facilitate student networking beyond the course.
Consider creating courses in Interagency Policy and Strategic Policy Making (future oriented).
Partnership
OPM – funding, program direction, instructional design, logistics, and students
EOP through CatMedia – developed the content for microlearning/gamification and delivered the residential portion
GamEffective – provided the gamification/microlearning platform, shared best practices, and provided technical support
Demographics of the course
Decrease the resident portion of the course from 2 weeks to 1 week
Decrease the overall cost
Blended course is the best approach to the policy courses and overall learning
Information delivery in a micro-learning/gamification format will increase learning of background information.
Increase student engagement with the material
Reusability of online materials
Residence course focuses on networking and problem-solving
Created a pre- and post-assessment focused on the game portion of the course.
Online Portion of the Course
Created online materials for gamification/microlearning
Created digital resources to supplement both online and residential course (short concise text or media)
Created an annotated bibliography for students who wanted to explore the topic in more depth.
Tested the game
Residential Portion of the Course
Created a simulation and all support materials
Scheduled expert speakers to reinforce materials and inform simulation
Scheduled small group facilitators for simulation functions
Provided questions for game review jeopardy
Supported students during the simulation
Provided feedback on the pilot
Functional Requirements:
Describe the portions of the student course game home screen.
Types of Data Collected – Evaluation Plan
Interpretation of Data
The game portion of the course is required for course completion.
--Out of 31 students who attended the residential FBPP course, 29 completed the gamification/microlearning component of the course.
--GamEffective offers data on individuals as well as aggregate data for the entire course. The two most helpful aggregates are shown in this slide.
--These 29 students spent 344 hours in the game, an average of 12 hours per student. Students can repeat their trivia game attempts to learn the content. The total number of trivia game attempts was 8,515, an average of 294 attempts per student.
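The per-student averages in the note above are straightforward divisions of the platform aggregates; restated for checking:

```python
# Aggregate figures from the GamEffective data cited in the note.
students = 29
total_hours = 344
total_trivia_attempts = 8515

avg_hours = round(total_hours / students)                # ~12 hours/student
avg_attempts = round(total_trivia_attempts / students)   # ~294 attempts/student
```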
The table shows that the students engaged in the game over the four weeks. It also shows that over half of the students engaged in and completed the last four campaigns during the final week the game was open. The game was required for completion of the course, and emails were sent to students reminding them of this requirement. The game closed the morning the residential portion of the course began. Only two of the 31 students registered for the course did not complete the game; these two students were added to the course during its fourth week.
Table 2:
It was interesting that students accessed the game the most on Wednesday through Friday during the day. Students were able to play the game at work as part of their duty hours.
More detailed Tables are in the Appendix.
1. Technology Access—we were curious how the students were able to access the game. They were given the opportunity to choose more than one response. While there was a mixture of government and personal devices used to access the game, 83% of the students accessed the game on their government computers.
2. Self-report on information retention of materials from the game. 80% reported 50% or more. 40% reported retaining 75% or more.
Engagement with the material. 50% of the students reported they were very engaged or somewhat engaged with the material in the game. Only 10% felt the game was somewhat boring or boring and uninteresting.
Comments—Technical Issues
Most Common Responses:
Issues getting into the game from government devices (3 responses)
Distorted display of the game on Internet Explorer
Performance of the GamEffective platform
EMDC-produced Jeopardy games
Comments—Student Proposed Changes
-- Overall positive
-- A couple of students did not like the gamification format
-- A few comments suggested the game be shortened
26 out of 31 students met the criteria of completing the game and both the pre- and post-assessments.
30 questions—focused on the game
80% of the students improved their scores on the post-assessment by an average of 4 points.
One student improved by 12 points.
Lack of context for the game (main complaint)
Laws and regulations section was too long and not as important as the budget process.
A final Jeopardy game the week before the residential course would help with the simulation.
Students remarked that, “We got it right.”
Approach to the content is very useful.
Mechanics of the Game
Pre-assessment results should be linked to the game.
The Metro Map was not clear and not motivational.
A clearer connection is needed between the game and the digital resources in the LMS.
A clearer indication is needed that the game was completed.
One student played the game exclusively off his phone. The microlearning organization made the content easy to access and complete on his smart phone.
Overall, the students thought the course was well done. In the residential portion of the course, students made comments about the content.
--They suggested that “injects” be added daily to the simulation.
--They thought a morning daily news or situation brief with updates and deliverables would be very helpful.
--The students appreciated the simulation covered the entire budget process well enough that they could go back to their agencies and share it.
--The students thought the speakers were excellent. They told real-life stories to illustrate the budget process.
--Michael O’Bannon did an “awesome” job setting up the context and moderating the entire residential course.
In terms of the mechanics of the simulation
--the students wanted a clearer definition of their roles and deliverables.
--A role sheet should be designed for each role. While this was available in the overall simulation materials, it was not obvious to them.
--Students also suggested that their role be noted on their badges to make it easy to identify those in their integrated and functional groups.
--Students thought that the simulation was logically divided.
--They thought there was a good mix between speakers and the simulation.
--The mentors in each group for the simulation were very helpful and available to students throughout the simulation.
In terms of the course administration and residential space
--students remarked they would prefer not paying in advance for meals and lodging. Breakfast was served at the hotel. Lunch was convenient at FEI. They preferred to make their own choices for dinner. Fewer and fewer classmates were attending dinner at FEI.
--No printers were available for students in Gwin Hall because there was a class in the converted library. Students were standing in line trying to print in another building. A printer that can be accessed with a thumb drive or with internet access in the classroom would help.
Comments Summary
Comments were solicited for each of the speakers in the course as well as the simulation. The comments were very positive.
--The simulation was the main instructional activity in the residential portion of the course. The following comment is a good example of the other comments. “Excellent presentations on different parts of the budge process. A good mix of different positions and rolls, which was great. Well pulled together by Mr. O'Bannon that helped make sense of it all. Well organized course. Would have preferred a summary sheet of the major points from the game part of the course, as just reading through that a couple of messages would have helped me tremendously at all stages of the course.”
There were three questions that solicited student comments.
--The first question requested the factors that contributed to their level of engagement in the class. The simulation was listed as the most common factor. Additional factors listed as engaging included the game, the speakers and their openness to answer questions with particular mention of Michael O’Bannon, and the course material.
--The second question asked students to comment on what detracted from their level of engagement. Of the 14 comments, disorganization and not understanding their role in the simulation were the factors mentioned most. A couple of comments were made on the layout of the game, the lack of a clear overview of the game and simulation, and the unwillingness of some students to engage in the simulation.
--The third question asked for examples from the course that reinforced their commitment to public service. The responses to this question focused on their awareness of the budget process and how to interact with it more effectively in their job.
--The fourth question asked students to provide specific outcomes that they hoped to achieve as a result of the course. The responses varied. Some students appreciated understanding the politics around the budget process and how to use that understanding to be successful in their agency's budget process. Other students cited understanding the budget process in order to better manage the budget in their organization.
--The fifth question asked students to provide any other comments on Customer Satisfaction.
--Most of the comments showed customer satisfaction with the course. An example of these comments is, “The information in this course was excellent, it provided me a strategic viewpoint that I never had prior.” There were also a few comments that gave good suggestions for improvement. For example, the following comment was balanced and gave helpful feedback.
“I applaud the effort that the staff put into the pilot course. I agree with the specific suggestions in Friday's debrief, including better organization of the online content, better organization of the handouts and website references, and a clear timeline of requirements (pre‐reading and role‐based products during the simulation). I would also add content for program managers several tiers below the Agency CFO. This could be accommodated by replacing the House floor debate portion of the simulation, which is in my opinion more focused on political actors than career government professionals.”
Overall
The blended course design was successful for the pilot. Overall students liked the design. They appreciated the residential portion of the course being shorter.
Expand the pre- and post-assessment to include more of the simulation. We had focused on the material in the game.
Most of the students used their government computer or laptop to access the game. Only one student couldn’t get access to the game on his DOD computer; he used his phone for the entire game. Internet Explorer security settings affected the display of the game, which was optimized for Chrome and Firefox.
Overall, students were engaged in the game materials. They were thoroughly engaged in the simulation (in residence).
Testing the game with a number of people assisted the team in cleaning up any errors in the materials.
Gamification/Microlearning Portion
--Context needs to be built into the beginning of the game
--Need to add short videos and more graphical keying.
--Digital resources need to be linked directly into each of the micro-learning sessions.
--Interactive Metro Map was not a value add.
--Points were a motivator for some of the students.
--Topic Jeopardy game reviews were linked outside of the game and need to be placed inside the game. Separate windows were confusing, especially on phones.
--Create an overall game review and release it the last week the game is open.
Residential Course
--Students really liked the simulation and how it brought the students through the entire process.
--Game review—shorter.
--Students wanted clearer instructions for their individual roles.
--The simulation could be enhanced by “injects” and a daily newsletter giving news and schedule for the day.
--Speakers were the right amount to reinforce the course learning outcomes and the simulation activities.
Overall
Pilot GamEffective platform for a second year. During this year, explore and refine security requirements for a full deployment, run multiple courses from the platform at one time, and incorporate portions of other EMDC courses.
Emails need to give more information setting expectations. This information was provided in CLD Central course space in a discussion space, but it wasn’t read by the students.
Online gamification/microlearning
Build an opening context for all game materials.
Link digital resources directly to each of the micro-learning sessions.
Add more multimedia components to micro-learning sessions.
Residential Portion
Shorten the game review session. Put it in digital form that can be projected.
Create role sheets for each of the simulation roles with information on the role, product deliveries/schedule, etc.
Create injects for the simulation to challenge the students to change direction.
Provide a morning newsletter to report on the previous day and provide a schedule for the current activities.
First, three of the students mentioned a problem accessing the game on their government computers. One student couldn’t access the game on their smart phone, and another had access problems on his tablet. The second common issue was the distorted display of the game on Internet Explorer; when students changed to Firefox or Chrome, the game displayed correctly. The third issue for some was the performance of the game platform: delays or failures to save activity completions, lag, stalling, and screens freezing. The final issue that students reported related to the Jeopardy review games created by EMDC staff and housed on a website separate from GamEffective. These games were completed in a separate window, and there was no return link to GamEffective. Students did not realize they had to click on a check mark in GamEffective to indicate they had completed the game.
The fifth question in the survey asked students to propose changes for the game. The most common response was positive. Examples of these included: “Excellent learning tool. I was surprised by the amount of information I was able to retain!,” and “if I could export it, I would share the format with my agency. Well structured, formatted and very beneficial.” However, there were a couple of students who did not like the format. For example, “Some people learn well this way, some don't - I don't.” There were a few students who suggested that we shorten the game. There were several responses that referred to proof-reading the game materials.