Slides for DET/CHE December 2010


Slides I used for my presentation at the 25th DET/CHE conference, December 2010, San Diego.

Speaker notes:
  • Paperback: 368 pages. Publisher: Jossey-Bass; 1st edition (April 22, 2002). Language: English. ISBN-10: 0470623071. ISBN-13: 978-0470623077.
  • There are counter-examples, but they are so few in number that they don’t count, implying that it’s a voting contest.
  • NSD 1: use of technology makes no significant difference (independent of online/distance vs. F2F). NSD 2: no significant difference between online and F2F.
Transcript:

    1. Seeking Evidence of Impact: Towards evidence-based practice / Malcolm Brown / Director, EDUCAUSE Learning Initiative
    2. Take the survey!
    3. (image slide)
    4. (image slide)
    5. Seeking Evidence of Impact
    6. Advisory team: Randy Bass, Georgetown / Gary Brown, Washington State / Joanne Dehoney, East Carolina University / Chuck Dziuban, University of Central Florida / John Fritz, U Maryland Baltimore County / Joan Lippincott, CNI / Phil Long, U Queensland / Vernon Smith, Rio Salado College
    7. impact / evidence / assessment / evaluation / seeking evidence of the impact of our technology-based innovations and practices in support of teaching and learning
    8. (image slide)
    9. “Does it work?”
    10. “Does it work?”
    11. “Does it work?”
    12. “Does it work?” / <your project goes here>
    13. “Does it work?” / Do the students learn “better”? / Do the faculty teach “better”? / Are the students more engaged? / Are the student evaluations “better”? / Is it “efficient”?
    14. So… do we / Know what we mean by “evidence”? / Know what we mean by “impact”? / Know how to best gather evidence? / Know how to best analyze impact? / Know how to have evidence improve practice?
    15. Why collect evidence now?
    16. pressure
    17. Enormous fiscal pressures
    18. Enormous fiscal pressures / Huge demand for HE / accountability
    19. Standing call for evidence-based practice
    20. “1. Establishing and supporting a culture of evidence” / “Given that college education is now one of the most important and expensive investments for American families, the call for accountability in higher education has intensified.” / “Colleges and universities… are recognizing the need for better systems that move beyond counting objects (such as computers, books, and so on) to measuring learning outcomes.”
    21. What administrative leadership and support might be required? / What learning outcomes need to be tracked? / What groups on campus need to be represented during this process? / How do we translate the data into best practices?
    22. “Most campus assessment activities… continue to be implemented as additions to the curriculum… rather than being integral to teaching and learning. [It] centers on “doing assessment” rather than on improving practice… Although firmly established in the mainstream by the year 2000, assessment as a movement is still striving for the cultural shift its original proponents had hoped for.” / Peter Ewell, pages 16–17, 2002, ISBN-10: 0470623071
    23. Unprecedented technology change / Major cultural change
    24. Blackboard
    25. Why Blackboard bought McGraw-Hill / Blackboard
    26. Higher ed under pressure
    27. The paradigm is shifting / paradigm?
    28. New paradigm
    29. student project design / research methods / formative assessment / the course today / summative assessment / use of new media / in-class pedagogy / content / syllabus / web site / lecture preparation
    30. “Improvement in post-secondary education will require converting teaching from a ‘solo sport’ to a community-based research activity.” –Herbert Simon
    31. (image slide)
    32. “Students starting school this year may be part of the last generation for which ‘going to college’ means packing up, getting a dorm room and listening to tenured professors.” / “Undergraduate education is on the verge of a radical reordering… The business model that sustained private US colleges cannot survive.” / “The typical 2030 faculty will likely be a collection of adjuncts alone in their apartments, using recycled syllabuses and administering multiple-choice tests from afar.”
    33. “Traditional higher education no longer holds a de facto monopoly in any of its primary functional areas, as it once did.” / “Higher education is left with only one choice: innovate in order to stay relevant.”
    34. “Newspapers are dying. Are universities next? The parallels between the two are closer than they appear.” / “… it would be a grave mistake to assume that the regulatory walls of accreditation will protect traditional universities forever.” / “Perhaps the higher-education fuse is 25 years long, perhaps 40. But it ends someday, in our lifetimes.”
    35. (image slide)
    36. Don’t we already know what works?
    37. The No Significant Difference (NSD) Debate
    38. 1983 / “…there are no learning benefits to be gained from employing any specific medium to deliver instruction.” / R. Clark, “Reconsidering Research on Learning from Media,” Review of Educational Research, vol. 53, no. 4.
    39. 1990s / “There are 355 research reports, summaries, and papers cited in which no significant difference was reported between the variables compared.”
    40. So Russell: / “Technology is not neutral…” / “The truth [is]… that students are not alike.” / “…the real challenge facing educators today is identifying the student characteristics and matching them with the appropriate technologies.” / T. Russell, “Technology Wars: Winners and Losers,” EDUCOM Review, vol. 32, no. 2, March/April 1997
    41. So Russell: / “The value of interactivity—especially synchronous interactivity—according to comparative research is, at best, suspect.” / T. Russell, “Technology Wars: Winners and Losers,” EDUCOM Review, vol. 32, no. 2, March/April 1997
    42. So Russell: / “There no longer is any doubt that the technology used to deliver instruction will not impact the learning for better or for worse.” / deliver instruction / enable instruction / engage students / conduct instruction / improve instruction / T. Russell, “Technology Wars: Winners and Losers,” EDUCOM Review, vol. 32, no. 2, March/April 1997
    43. “The confounding factor here is that each medium consists of many attributes that may affect the value of the medium’s instructional impact.” / “To credit or blame the delivery medium for learning ignores the effectiveness of the instructional design choices made while creating a learning event.” / EDUCAUSE Quarterly, no. 2, 2001
    44. (image slide)
    45. online! / F2F! / hi tech! / low tech! / X! / not X! / formal! / informal!
    46. (image slide)
    47. The latest installment
    48. released August 2009
    49. “… on average, students in online learning conditions performed better than those receiving face-to-face instruction.”
    50. released July 2010
    51. “the … report does not present evidence that fully online delivery produces superior learning outcomes for typical college courses, particularly among low-income and academically underprepared students.”
    52. “…we find modest evidence that live-only instruction dominates Internet instruction.”
    53. “Students attending classes in… technology-enhanced learning spaces exceeded final grade expectations… suggesting strongly that features of the spaces contributed to their learning.” / “Different learning environments affect teaching-learning activities even when instructors attempt to hold these activities constant.”
    54. online! / F2F!
    55. It’s perplexing…
    56. Distraction factors / aka: “X is making us stupid”
    57. “Far from making us stupid, these technologies are the only things that will keep us smart.”
    58. “… home computer technology is associated with modest but statistically significant and persistent negative impacts on student math and reading test scores”
    59. (image slide)
    60. (image slide)
    61. Our current alignment
    62. Who responded? / N=175 (so far)
    63. (image slide)
    64. Most important indicators of impact?
    65. select 3 most important
    66. What’s hardest?
    67. scale 1 to 5
    68. Methods routinely used
    69. check all that apply
    70. Consensus on “evidence” and “impact”
    71. (image slide)
    72. Who currently gathers evidence?
    73. choose 1
    74. (image slide)
    75. Challenges
    76. understanding the research questions / what counts as evidence (what data should I collect?) / dependence on poor data sources / money; funding for analyst position / lack of expertise / how do learning spaces factor into this? / creating a culture of assessment
    77. What should ELI SEI do?
    78. Build Community / community of practice / share practices / share data/evidence
    79. Provide Tools / templates; sample surveys / case study collection / road map / deliver “legos” / ways to help faculty use data / how to best communicate results
    80. Provide Services / review publications / recommended tool set / do it all for free or low cost / create referral network / PD opportunities / webinars, workshops / external review service
    81. What will ELI SEI do? / Build Community / Provide Tools / Provide Services
    82. Community: 2011 Annual Meeting (Feb 14-16, Wash DC); 2011 Spring Focus Session (online; April 13-14); 2012 Annual Meeting (Feb); ? Symposium @ EDUCAUSE 2011; sessions at regional conferences; Constituent Group / Tools: case study series; white papers & briefs; tool exemplars and templates; analytics / Services: online workshops; F2F workshops; ELI Forum; webinars
    83. (image slide)
    84. Take the survey! Have colleagues take the survey!
    85. (image slide)
    86. (image slide)
    87. Send me email / resource suggestions / what you need / example projects / exemplary practitioners / ways you’d like to help / come to ELI events / anything to help SEI move forward
    88. Questions? Comments?
    89. Thank you!