Csulb Presentation November 2008

Speaker notes (talk outline):
  • My courses: face-to-face elements; online elements
  • Evaluation of these courses: tools/techniques; findings/results
  • The latest at SDSU / take-home messages

    1. Large Classes and Blended Learning: What Makes It Work
       Mark A. Laumakis, Ph.D.
       San Diego State University
       Lecturer, Department of Psychology
       Faculty in Residence, Instructional Technology Services
       [email_address]
    2. What I Teach: Mega Courses
       • Two 500-student sections of Psychology 101 (Introductory Psychology)
         • One fully face-to-face (traditional)
         • One in a blended learning format (45% online)
    3. Setting the Stage
       • Spent Summer 2006 redesigning Psych 101 for a blended learning format
         • Blended learning integrates online and face-to-face activities in a planned, pedagogically valuable manner (Sloan-C Workshop on Blended Learning, 2005)
       • Utilized fundamental principles of instructional design
       • Employed a scholarship-of-teaching approach
    4. Face-to-Face Classes
       • Extensive use of CPS clickers
         • ConceptCheck questions
         • Attendance
         • Demonstrations
         • Anonymous polling
         • Predicting outcomes
         • Peer instruction (Mazur)
       • Extensive use of multimedia
         • Videos, demonstrations, and simulations from the text and the web
    5. Clicker ConceptCheck Question
    6. Clicker Results Chart
    7. Clicker Data: Spring 2008 (% Agree or Strongly Agree)
       • Class clicker usage makes me more likely to attend class: 92%
       • Class clicker usage helps me to feel more involved in class: 84%
       • Class clicker usage makes it more likely for me to respond to a question from the professor: 92%
       • I understand why my professor is using clickers in this course: 94%
       • My professor asks clicker questions which are important to my learning: 92%
    8. Online Sessions
       • Delivered via Wimba Live Classroom
       • Live sessions were archived for later viewing
       • Sessions included
         • Mini-lectures
         • Demonstrations
         • Polling questions
         • Feedback at the end of each session via polling questions
    9. Wimba Live Classroom Interface
    10. Polling Question in Live Classroom
    11. Review of Key Tools
        • Face-to-Face Classes
          • PowerPoint
          • CPS clickers
          • Tablet PC
        • Online Sessions
          • Wimba Live Classroom
    12. Fall 2006-Spring 2007 Evaluation
        • Evaluation led by Marcie Bober, Ph.D. (Educational Technology)
        • Efforts supported by Academic Affairs, Instructional Technology Services, and the College of Sciences
        • Initial evaluation is part of an ongoing evaluation process
          • Course (re)design is an iterative process
          • Focus on continuous improvement
    13. Evaluation Tools and Strategies
        • A multimethod approach included the following:
          • Week 7 “How’s It Going?” online survey
          • In-class observations
          • IDEA Diagnostic Survey
          • Student focus groups
          • Departmental course evaluations
          • Course grades
    14. Evaluation Findings: IDEA Diagnostic Survey
    15. Evaluation Findings: IDEA Diagnostic Survey
        (Note: Top 10% = a score of 63 or more)

                                  Fall 2006     Fall 2006   Spring 2007   Spring 2007
                                  Traditional   Blended     Traditional   Blended
        Progress on objectives    73            70          77            77
        Excellent teacher         68            65          68            69
        Excellent course          72            62          71            73
    16. Evaluation Findings: Departmental Course Evaluations
    17. Evaluation Findings: Fall 2006 Course Grades
    18. Evaluation Findings: Spring 2007 Course Grades
    19. Evaluation Findings: Course Grades, Fall/Spring Combined
    20. Evaluation Findings: Fall 2007 Course Grades
    21. Evaluation Findings: Spring 2008 Course Grades
    22. Summary of Course Grade Data
    23. The Learning Continuum
        (Diagram: a continuum running from conventional face-to-face classes to entirely online classes, with tick marks at 20%, 40%, 60%, and 80% online)
    24. Blended Learning = “The Sweet Spot”
        (Diagram: the same 20%-80% continuum, with the blended middle range labeled “The Sweet Spot”)
    25. What’s the Latest?
        • Introduction of more blended learning courses at SDSU
          • Students now seek out the blended learning section
        • Continued evolution of online sessions
          • Less lecture
          • More demonstrations, simulations, and polling questions
        • Fully online Psych 101 course in Summer 2008
          • Course enrollment of 66 students vs. an average of 46 in the previous 5 years (traditional face-to-face course)
          • D/F rate dropped from 14.1% to 11.0%
    26. Lessons Learned
        • Yes, you can do blended learning in a mega course!
        • Course redesign takes time and effort
        • Support is key
        • Moving to a blended learning format does NOT mean simply moving your face-to-face course online
          • You must change the way you teach
        • Provide a rationale to students
          • Why you’re doing what you’re doing
        • Anticipate problems with technology
