BISG's MIP for Higher Ed Publishing 2013 -- Rebecca Griffiths
Rebecca Griffiths of Ithaka S+R presents at BISG's Making Information Pay for Higher Ed Publishing conference, February 7, 2013.

BISG's MIP for Higher Ed Publishing 2013 -- Rebecca Griffiths Presentation Transcript

  • 1. Now welcoming Rebecca Griffiths, Program Director for Online Learning, Ithaka S+R: "The Promise of Highly Interactive Online Learning." © 2013, the Book Industry Study Group, Inc.
  • 2. Interactive Online Learning in Public Universities
        Randomized Study of Interactive Online Statistics Course
        Testing the Benefits of MOOCs
        Rebecca Griffiths, Program Director for Online Learning, Ithaka S+R
        February 7, 2013
  • 3. Motivations for Online Learning Program
      • Budget crises and pressure to increase graduation rates, particularly at public universities
      • Potential of sophisticated, interactive online systems, especially when used in hybrid mode, to:
          - Improve student outcomes
          - Reduce disparities in outcomes
          - Reduce costs without sacrificing learning
      • Need for solid evidence about the effectiveness or cost-saving potential of such systems
  • 4. Randomized Study of Interactive Online Statistics Course
      • Study conducted by William G. Bowen, Matthew Chingos, Tom Nygren, and Kelly Lack
      • Completed spring 2012
      • Test of an introductory statistics course developed at Carnegie Mellon University
          - Primarily text-based with cognitive tutors
      • Hybrid vs. traditional face-to-face class sections
  • 5. Summary of Research Design
      • Registration and recruitment (not necessarily in that order)
          - Students register for introductory statistics course
          - Students recruited to participate in study
      • Random assignment of participants (see the sketch below)
          - Traditional section of statistics course
          - Hybrid section of statistics course
      • Baseline measures (beginning of semester)
          - Student background survey
          - Score on standardized test of statistical reasoning (CAOS test)
      • Outcome measures (end of semester)
          - Completion rate
          - Pass rate
          - Score on common final exam questions
          - Score on CAOS test (second administration of test)
          - Student satisfaction survey
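To make the random-assignment step concrete, here is a minimal sketch, assuming participants arrive as (student_id, institution) pairs. The stratified split within each institution mirrors the study's per-institution sections, but the function name and data shape are illustrative, not taken from the study's actual procedures.

```python
# A minimal sketch (not the study's code) of randomizing consenting,
# registered students to traditional or hybrid sections, stratified
# by institution so each campus gets a roughly even split.
import random

def assign_sections(participants, seed=42):
    """Randomly assign each (student_id, institution) pair to
    'traditional' or 'hybrid', balancing within each institution."""
    rng = random.Random(seed)
    assignments = {}
    by_institution = {}
    for student_id, institution in participants:
        by_institution.setdefault(institution, []).append(student_id)
    for institution, students in by_institution.items():
        rng.shuffle(students)          # randomize order within the stratum
        half = len(students) // 2
        for sid in students[:half]:
            assignments[sid] = "traditional"
        for sid in students[half:]:
            assignments[sid] = "hybrid"
    return assignments

# usage with hypothetical IDs and institutions:
demo = [("s1", "A"), ("s2", "A"), ("s3", "B"), ("s4", "B")]
print(assign_sections(demo))
```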
  • 6. Fall 2011 Study Sizes

        Institution                    Traditional   Hybrid   Total
        Institution A                       45          52       97
        Institution B                      112         117      229
        Institution C                       45          47       92
        Institution D                        7           9       16
        Institution E, Department 1         16          15       31
        Institution E, Department 2         24          26       50
        Institution F                       43          47       90
        Total                              292         313      605
  • 7. Demographics of Study Participants

        Characteristic                                    Traditional   Hybrid   Adj. Diff.   Signif.?
        Proportion of students who are black                  14%        14%        0%
        Proportion of students who are Hispanic               20%        14%       -5%
        Proportion of females                                 54%        61%        7%           +
        Proportion of students whose parents' income
          is <$50,000 a year                                  49%        50%       -2%           +
        Proportion of students who have at least one
          parent with a college degree                        49%        47%        2%
        Proportion of full-time students                      90%        90%        0%
        Mean cumulative college GPA                           2.63       2.63      -0.01

        Notes: Adjusted differences (average within-institution differences) control for
        institutional dummy variables. "Signif.?" indicates whether the result is
        statistically significant from zero at ** p<0.01, * p<0.05, + p<0.10. (A sketch
        of the adjusted-difference computation follows this slide.)
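The "adjusted differences" in this table and the next are average within-institution differences: the coefficient on a hybrid indicator in a regression that controls for institution dummies. A minimal sketch of that computation, with hypothetical column names and fabricated example data (not the study's data):

```python
# Illustrative sketch of an "adjusted difference": regress a
# characteristic on a 0/1 hybrid indicator plus institution dummies;
# the hybrid coefficient is the average within-institution difference.
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_difference(df, outcome):
    """Return the hybrid coefficient and its p-value from an OLS of
    `outcome` on the hybrid indicator with institution fixed effects."""
    model = smf.ols(f"{outcome} ~ hybrid + C(institution)", data=df).fit()
    return model.params["hybrid"], model.pvalues["hybrid"]

# tiny fabricated example, for illustration only
df = pd.DataFrame({
    "hybrid":      [0, 1, 0, 1, 0, 1, 0, 1],
    "institution": ["A", "A", "A", "A", "B", "B", "B", "B"],
    "female":      [1, 1, 0, 1, 0, 0, 1, 1],
})
diff, p = adjusted_difference(df, "female")
print(f"adjusted difference: {diff:+.2f} (p = {p:.2f})")
```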
  • 8. How Participants Compare with Non-Participants

        Characteristic                               Participants   Non-participants   Adj. Diff.   Signif.?
        Proportion of students who are black              14%             13%              0%
        Proportion of students who are Hispanic           17%             10%              3%           *
        Proportion of females                             58%             56%              1%
        Proportion of full-time students                  90%             86%              5%           **
        Mean age                                          21.9            21.6            -0.3
        Mean cumulative college GPA                       2.63            2.24             0.12         *
        Proportion of students who passed course          78%             81%             -5%           *

        Notes: Adjusted differences control for institutional dummy variables. "Signif.?"
        indicates whether the result is statistically significant from zero at ** p<0.01,
        * p<0.05, + p<0.10.
  • 9. Completion Rates and Pass Rates
        [Bar chart: Completion and Pass Rates (percentages). Completion rate: 88%
        traditional vs. 84% hybrid; pass rate: 81% traditional vs. 78% hybrid (n=605).
        Results depicted control for institution effects and were not significant
        at p<0.10.]
  • 10. Performance on End-of-Semester Assessments
        [Bar chart: Post-Course CAOS Scores and Scores on Common Final Exam Questions
        (percentage of questions answered correctly). Post-course CAOS score: 57%
        traditional vs. 59% hybrid (n=458); common final exam questions: 47% traditional
        vs. 48% hybrid (n=431). Results depicted control for institution effects and
        were not significant at p<0.10.]
  • 11. Results by Subgroup (see the sketch below)

        Subgroup                                    Pass Rate       Post-CAOS       Final Exam
        Black/Hispanic                              0.02 (N=188)    0.00 (N=143)   -0.00 (N=131)
        White/Asian                                 0.05 (N=406)    0.01 (N=308)    0.03 (N=292)
        Male                                        0.04 (N=257)   -0.00 (N=194)   -0.00 (N=173)
        Female                                      0.05 (N=348)    0.01 (N=264)    0.04 (N=258)
        First-generation college students           0.01 (N=316)   -0.00 (N=231)    0.02 (N=258)
        Students who have at least one parent
          with a college degree                     0.07 (N=289)    0.01 (N=227)    0.03 (N=216)
        Pre-CAOS test low                           0.02 (N=266)    0.01 (N=215)   -0.03 (N=196)
        Pre-CAOS test high                         -0.02 (N=265)    0.00 (N=234)    0.06+ (N=222)

        Notes: Significant at + p<0.10. Results depicted control for institution effects.
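The subgroup results above can be read as re-estimating the hybrid effect separately within each subgroup. A sketch of how that could be done, again assuming the hypothetical hybrid and institution column names from the previous sketch:

```python
# Sketch of a subgroup analysis: estimate the hybrid effect within each
# level of a subgroup column, controlling for institution dummies as before.
import statsmodels.formula.api as smf

def subgroup_effects(df, outcome, subgroup_col):
    """For each subgroup level, regress `outcome` on the hybrid indicator
    with institution fixed effects; collect the hybrid coefficient, its
    p-value, and the subgroup size N."""
    results = {}
    for level, sub in df.groupby(subgroup_col):
        fit = smf.ols(f"{outcome} ~ hybrid + C(institution)", data=sub).fit()
        results[level] = (fit.params["hybrid"], fit.pvalues["hybrid"], len(sub))
    return results

# e.g. with hypothetical 'passed' (0/1) and 'first_generation' columns,
# subgroup_effects(df, "passed", "first_generation") would yield the
# hybrid-vs-traditional pass-rate effect within each subgroup, with N.
```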
  • 12. End-of-Semester Student Survey Responses
        [Bar chart: Mean Course Rating and Amount Learned at End of Semester. Overall
        rating (n=435): 2.10 traditional vs. 1.85 hybrid (+); amount learned (n=438):
        2.10 traditional vs. 1.89 hybrid (+). Scale: 0 = rated course much worse than a
        typical lecture-based course / learned much less; 4 = rated course much better /
        learned much more. Significant at + p<0.10. Results depicted control for
        institution effects.]
  • 13. Takeaways of Empirical Study
      • Students in the hybrid sections had roughly similar learning outcomes to students in traditional-format sections. Our finding of no significant differences is precisely estimated (see the sketch below).
      • We also calculated results separately for each institution, and for subgroups of students defined in terms of characteristics like race/ethnicity, gender, parental education, primary language spoken, and GPA.
          - Results broken down by institution did not reveal any noteworthy patterns.
          - We did not find any evidence that the hybrid-format effect varied by any subgroup characteristics.
      • Worries that use of online courses may hurt basic student learning outcomes do not appear to be well-founded.
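"Precisely estimated" means the point estimate is close to zero and its confidence interval is narrow, so meaningfully large effects in either direction can be ruled out. A small illustration with made-up numbers (not the study's estimates):

```python
# Illustration of a precisely estimated null: a near-zero effect whose
# 95% confidence interval is tight enough to exclude large effects.
def confidence_interval(effect, std_err, z=1.96):
    """Two-sided 95% confidence interval for a treatment-effect estimate."""
    return effect - z * std_err, effect + z * std_err

# Hypothetical numbers: an effect of +0.01 with a standard error of 0.02
# gives a CI of about (-0.03, +0.05), so even the interval's endpoints
# rule out substantial harm or benefit from the hybrid format.
lo, hi = confidence_interval(0.01, 0.02)
print(f"95% CI: ({lo:+.2f}, {hi:+.2f})")
```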
  • 14. Instructor Experience
        [Bar chart: Average Years of College-Level Teaching Experience, comparing hybrid
        instructors with face-to-face instructors.]
  • 15. Instructor Interaction with Students
        [Bar charts: Average Classroom Time Spent Lecturing and Attendance Rates, each
        comparing hybrid instructors with face-to-face instructors.]
  • 16. Instructors' Assessment of the Hybrid Course
      • Long-term impact on time spent: if the hybrid course were used regularly, 5 out of 10 instructors said much less or somewhat less time would be spent in the long run; only 1 said somewhat more time.
      • Evaluation of the online course: mixed reviews; most found it acceptable, but all mentioned a few areas of mismatch.
      • Many instructors believed their students had negative views of the online course.
      • The CMU course is a good prototype, but there is room for improvement.
  • 17. Next steps
      • Encourage the development of more high-quality, interactive, customizable online learning systems and content
      • More evidence
      • Further exploration of the potential for cost savings
  • 18. Along came the 2012 class of MOOCs – Massively Open Online Courses
      • Massive – some have attracted over 100,000 registrations
      • Open – freely accessible to anyone with an internet connection
      • Online Courses –
          - Led by an instructor at an institution
          - Have a beginning and an end
          - Have lectures, in-video quizzes, assignments, quizzes, and tests
          - Heavy reliance on peer collaboration, even for grading
          - Offer certificates; exploring options for accreditation (e.g., testing centers, ACE)
      • Not quite what institutions need, but is there a way to bridge the gap?
  • 19. Testing the Benefits of MOOCs
      • Partnership with the University System of Maryland to test the hypothesis that MOOCs can be used to improve student outcomes and/or reduce costs within a public university system.
      • Research plan:
          - 5-7 controlled side-by-side tests
          - 5-10 case studies
      • Why not randomized? Things are moving too fast!
  • 20. What We Aim to Learn
      • Can MOOCs be used to improve student outcomes?
      • What implementation challenges arise, and how can these be overcome?
      • What models of adoption are there? What are the potential benefits and challenges of each?
      • What can we learn about cost savings?
  • 21. Other Things We Might Learn
      • Will MOOCs be adopted like multimedia textbooks?
      • How can one tell a good MOOC from a bad MOOC?
      • What are the key differences between MOOC platforms?
      • Which features of MOOCs work well in a campus environment? Which do not?
      • What conditions are conducive to success?
  • 22. How Do / Will MOOCs "Make Information Pay"?
      • Student–instructor–developer feedback loops enable constant improvement of courses
      • More data and better analytics are needed to model student profiles, behavior, experience, knowledge, etc.
      • Ownership of these data will be a key issue
  • 23. Interactive Online Learning in Public Universities. Rebecca Griffiths (rgriffiths@ithaka.org)
  • 24. Participating Institutions
      • City University of New York
          - Baruch College
          - Borough of Manhattan Community College*
          - City College
      • State University of New York
          - University at Albany
          - SUNY Institute of Technology
          - Nassau Community College*
      • University of Maryland
          - Baltimore County
          - Towson University
      • Montgomery County College, Maryland*

      * Data from these institutions were analyzed separately and are not included in this presentation.

      Cautionary note: We cannot assume that the findings presented today for 4-year public institutions necessarily hold for community colleges, nor can we compare outcomes at community colleges with outcomes at 4-year institutions.