BISG's MIP for Higher Ed 2012 -- SAXBERG

Transcript

  • 1. Innovations in Higher Education Learning. Bror Saxberg, CLO, Kaplan, Inc. February 8, 2012
  • 2. What’s new for learning? Overview; Cognitive Task Analysis; Kaplan Way – Kaplan University Course Redesigns; Q and A
  • 3. Bror Saxberg, Chief Learning Officer, Kaplan, Inc.
    • Integrating the design, building, monitoring, and improvement of learning environments; individualizing learning experiences using our scale; and, ultimately, driving greater student career success.
    • Former CLO for K12, Inc. – structured use of technology, cognitive science, and online and offline materials for 1,700 teachers and 55,000 students.
    • Former Publisher and General Manager for DK Multimedia, Inc.
    • Management consultant with McKinsey & Company.
    • Education: Ph.D. in Electrical Engineering and Computer Science, MIT; M.D., Harvard Medical School; M.A. in Electrical Engineering and Computer Science, MIT; M.A. in Mathematics, Oxford University; B.S. in Electrical Engineering and B.S. with Honors in Mathematics, University of Washington.
  • 4. What Our Students Told Us They Want – Brand Promise and Brand Pillars:
    • We strive to make education as personalized to you as possible – tailoring our courses around your individual needs.
    • We are dedicated to getting you the results that matter in the time that matters.
    • We move quickly with constant innovation to better meet your needs.
    • We are here to help you achieve success at critical milestones along your educational journey.
  • 5. To respond, consider structuring key initiatives to take advantage of what’s known about learning – and data. Rapidly test and scale learning innovations.
  • 6. What’s new for learning? Overview; Cognitive Task Analysis; Kaplan Way – Kaplan University Course Redesigns; Q and A
  • 7. Employers actually expect job applicants to lack the occupational/technical skills required to do the job. Survey question: “Do you expect job applicants to be lacking specific occupational skills or technical skills?”
    • Slightly over half of all respondents (52.8%) expected that job applicants would lack occupational skills.
    • In healthcare, where occupational certifications and licensures are required, over 68% of respondents expected that job applicants would lack occupational skills.
    Source: March 2011 Workforce Connections, Inc. survey of employers in western Wisconsin. Over 400 employers from all 8 counties responded. All sizes of businesses were represented, with the majority of responses coming from businesses with fewer than 50 employees.
  • 8. Cognitive Task Analysis (CTA) may provide answers
    • CTA is an interview strategy for capturing how highly successful experts perform complex tasks in a variety of settings.
    • The goal is to develop authentic demonstration and practice opportunities for performing at expert levels.
    • The experts interviewed 1) have recent (past 2–3 months) experience, 2) are consistently successful, and 3) are NOT trainers.
    • Interviews are done with 3–4 experts to unpack their strategies; these are merged into an efficient approach suitable for training.
    • A range of problem examples and performance scenarios is also collected from the experts for use in instruction.
  • 9. Medical Assistant current course content. [Table mapping current courses (e.g., Pharmacology, Diseases – Human Body) to content areas; X = substantial content, x = ancillary content]
  • 10. MA CTA: identifies key tasks/skills performed by experts (original content mapped to new focus).
    • Tie to domain tasks as identified by experts.
  • 11. MA Program: skills addressed in new sequence (new focus).
    • Tie to domain tasks as identified by experts.
  • 12. MA Program: skills addressed in new sequence.
    • Tie to domain tasks as identified by experts.
    • Repeated use of skills across courses (B: Begin; A: Advanced; R: Reinforce).
  • 13. MA Program: new courses include previous content.
    • Tie to domain tasks as identified by experts.
    • Repeated use of skills across courses.
    • Original concepts spread across task instruction, not confined to courses.
  • 14. What’s new for learning? Overview; Cognitive Task Analysis; Kaplan Way – Kaplan University Course Redesigns; Q and A
  • 15. A lot is known about what drives learning now. Framework: Instructional Events (in the learning environment) → Student Learning Events (hidden – inside students’ minds) → Performance (observable – indicates knowledge).
  • 16. The same framework, with three rows: Knowledge, Motivation, Metacognition.
  • 17. The full framework:
    • Knowledge
      • Instructional Events: Explicit – information, explanation, examples, demos; Implicit – practice tasks/activities (prompts and response); diagnosis and feedback.
      • Learning Events: Explicit/declarative/conceptual (“what”); implicit/procedural (“how”); knowledge components (procedures + facts, concepts, principles, processes).
      • Performance: response accuracy/errors; response fluency/speed; number of trials; amount of assistance (hints); reasoning.
    • Motivation
      • Instructional Events: orientation/inoculation; monitoring; diagnosis and treatment (persuasion, modeling, dissonance).
      • Learning Events: value beliefs; self-efficacy beliefs; attribution beliefs; mood/emotion.
      • Performance: behavior related to starting, persisting, and mental effort; self-reported beliefs.
    • Metacognition
      • Instructional Events: structure; guidance.
      • Learning Events: planning, monitoring; selecting, connecting.
      • Performance: amount of guidance required/requested.
    See: Koedinger, K.R., Corbett, A.T., and Perfetti, C. (2010). The Knowledge-Learning-Instruction (KLI) Framework: Toward Bridging the Science-Practice Chasm to Enhance Robust Student Learning.
  • 18. Task-centered instruction
    • Move from simple to increasingly difficult tasks – NOT “PBL” sink or swim.
    • Teach everything needed for each task.
    • Fade coaching/support over time.
  • 19. Instructional design can change instructional outcomes at scale. Effect sizes (in s.d. units) by principle:
    • Multimedia – use relevant graphics and text to communicate content: 1.5
    • Contiguity – integrate the text near the graphics on the screen; avoid covering or separating integrated information: 1.1
    • Coherence – avoid irrelevant graphics, stories, videos, media, and lengthy text: 1.3
    • Modality – include audio narration where possible to explain graphic presentations: 1.0
    • Redundancy – do not present words as both on-screen text and narration when graphics are present: 0.7
    • Personalization – script audio in a conversational style using first and second person: 1.3
    • Segmenting – break content into small topic chunks that can be accessed at the learner’s preferred rate: 1.0
    • Pre-training – teach important concepts and facts prior to procedures or processes: 1.3
    • Etc. – worked examples, self-explanation questions, varied-context examples and comparisons, etc.: ??
    Source: E-Learning and the Science of Instruction, Clark and Mayer, 2nd ed., 2008.
  • 20. Impact is not small! A 1 s.d. effect size moves a student from the 50th percentile to the 84th percentile.
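The slide’s 50% → 84% jump is just the standard normal CDF evaluated at the effect size: a median student sits at the 50th percentile, and shifting by d standard deviations lands at Φ(d). A minimal sketch (stdlib only; the loop values are the effect sizes from the Clark and Mayer table on the previous slide):

```python
from math import erf, sqrt

def normal_cdf(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Phi(1.0) is about 0.841, i.e. the slide's 50% -> 84% move.
for d in (0.0, 0.7, 1.0, 1.3, 1.5):
    print(f"effect size {d:.1f} s.d. -> {normal_cdf(d):.0%} percentile")
```

This assumes normally distributed outcomes, which is the standard reading of effect sizes reported in s.d. units.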
  • 21. The instructional design process should follow evidence. The evidence about learning points to a sequence of activities that optimizes learning: Overviews → Information → Examples → Practice → Assessment → Learning Outcomes, with Guidance (for motivation and metacognition) throughout. Design proceeds one way along this sequence; delivery goes the other.
  • 22. In 2011 KU and KLI launched a course redesign pilot:
    1. Apply “Kaplan Way” evidence-based instructional design to several Kaplan University courses (high volume, needing improvement).
    2. Deliver the courses in a simplified eCollege template.
    3. Develop a replicable/scalable process, templates, and technology.
    4. Evaluate the impact on student outcomes.
    Pilot 1: August 3 – October 12. Pilot 2: October 19 – December 28.
  • 23. The student experience: before and after.
    Before – Read, Write, Discuss:
    • Outcomes and content sometimes loosely aligned
    • Limited demonstrations, worked examples, and practice
    • General assessment rubrics
    • Reliance on discussion boards
    • Limited support for motivation
    After – Prepare, Practice, Perform:
    • Outcomes and content precisely aligned
    • Frequent demonstrations, worked examples, practice, feedback
    • Detailed scoring guides
    • Evidence-based support for motivation
    • Instructor coaching
  • 24. Results: significant learning and business impact from KU course redesigns – and more to come.
    Learning impact:
    • Higher instructor satisfaction: instructors see the benefits of the design; instructor materials/support for the facilitator role
    • Lower student satisfaction: courses are more demanding and time consuming
    • Higher retention, fewer withdrawals: support for at-risk students to stay engaged
    • More time-on-task: students in pilot versions of courses spend more time online in the course
    • Better learning outcomes: pilot students earn higher CLA scores and higher scores on common assessments
    Financial impact:
    • 3% retention gain: significant benefit to learners and the university
    • 14% gain in “student success”
  • 25. The pilot courses delivered a 14-percentage-point difference in student success rate (42% vs. 28%) – a 50% increase over control courses.
    Pilots 1 and 2, combined analysis of group differences in student “success,” where “success” = CLA average ≥ 4 AND passed the course AND retained to the next term, controlling for differences in course, students, instructors, and seasonality.
    Statistical significance (least squares means; Pr > |t| for H0: LSMean(i) = LSMean(j); dependent variable: success):
    • Group 1 (n = 23,748): vs. 2: 0.9795; vs. 3: <.0001; vs. 4: 0.914
    • Group 2 (n = 6,121): vs. 1: 0.9795; vs. 3: <.0001; vs. 4: 0.9584
    • Group 3 (n = 508): vs. 1, 2, and 4: <.0001
    • Group 4 (n = 582): vs. 1: 0.914; vs. 2: 0.9584; vs. 3: <.0001
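The slide mixes two ways of describing the same gain: 14 is the absolute difference in percentage points, and 50% is that difference relative to the control rate. A quick check of the arithmetic (the 0.42/0.28 figures are from the slide; labeling them pilot vs. control is my reading of the chart):

```python
# Success rates from the slide's bar chart (assumed: pilot vs. control).
control_rate = 0.28
pilot_rate = 0.42

# Absolute gain, in percentage points.
absolute_gain_pp = (pilot_rate - control_rate) * 100
# Relative gain, as a fraction of the control rate.
relative_gain = (pilot_rate - control_rate) / control_rate

print(f"absolute gain: {absolute_gain_pp:.0f} percentage points")  # 14
print(f"relative gain over control: {relative_gain:.0%}")          # 50%
```

Reporting both numbers avoids the common ambiguity of a bare “14% difference,” which could be misread as a relative change.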
  • 26. Results: significant learning and business impact from KU course redesigns – and more to come.
    Learning impact (as on the previous results slide): higher instructor satisfaction; lower student satisfaction; higher retention and fewer withdrawals; more time-on-task; better learning outcomes.
    Financial impact:
    • 3% retention gain: KU OIE’s team estimates a return of $1.5M in OI annually from an investment of $375K
    • 14% gain in “student success”: “success” = CLA average ≥ 4, passed, and retained; the analysis controlled for variations in course, student, instructor, and seasonality; this translates to a 50% increase over control courses
  • 27. Student feedback on the benefits of extra practice: “Something I found to be interesting was the degree of understanding between me and another individual that wasn’t in this class. A girl I had met in a previous term that has a similar degree plan but ended up in a regular medical terminology course; still we would discuss the differences and similarities between our assigned classes. During our unit 8 test she called me hysterical about all the different elements of the final tests and couldn’t seem to grasp the concept of the 1st part of the test, i.e., analysis diagram, creating new terms from word roots, etc. I was mystified that something that had become 2nd nature to me, mainly due to the time spent every week filling out the Analysis Tables, was so difficult for her to comprehend. It was at that point I realized all the griping I had done was actually the reason my level of understanding is more evolved than somebody who never experienced it.”
  • 28. What’s new for learning? Overview; Cognitive Task Analysis; Kaplan Way – Kaplan University Course Redesigns; Q and A
  • 29. Appendix: initial readings for “learning engineers”
    • Why Students Don’t Like School, Daniel Willingham – highly readable! ;-)
    • Talent Is Overrated, Geoffrey Colvin – highly readable! ;-)
    • E-Learning and the Science of Instruction, Clark and Mayer, 3rd ed.
    • “First Principles of Learning,” Merrill, M.D., in Reigeluth, C.M. & Carr, A. (Eds.), Instructional-Design Theories and Models III, 2009.
    • How People Learn, John Bransford et al. (Eds.).
    • “The Implications of Research on Expertise for Curriculum and Pedagogy,” David Feldon, Educational Psychology Review (2007) 19:91–110.
    • “Cognitive Task Analysis,” Clark, R.E., Feldon, D., van Merrienboer, J., Yates, K., and Early, S., in Spector, J.M., Merrill, M.D., van Merrienboer, J.J.G., & Driscoll, M.P. (Eds.), Handbook of Research on Educational Communications and Technology (3rd ed., 2007), Lawrence Erlbaum Associates.