Seeking Evidence of Impact: Answering "How Do We Know?"


Published in: Education, Technology


1. Veronica Diaz, PhD
   Associate Director, EDUCAUSE Learning Initiative, EDUCAUSE
   League for Innovation Learning College Summit, Phoenix, AZ
   Seeking Evidence of Impact: Answering "How Do We Know?"
2. Today's Talk
   Review what it means to seek evidence of impact of teaching and learning innovations
   Consider strategies for using evaluation tools effectively
   Determine ways to use evidence to influence teaching practices
   Review ways to report results
3. Who are we?
   Academic instruction
   Faculty development
   Instructional technology
   Instructional design
   Library
   Information technology
   Senior administration
   Other
4. Why are we here?
5. I am working on evaluating T&L innovations now.
   Evaluation of T&L is part of my formal job description.
   My campus unit's director or VP mandates our gathering evidence of impact.
   A senior member of the administration (dean, president, senior vice president) mandates gathering evidence of impact in T&L.
   I am working as part of a team to gather evidence.
   Accreditation processes are prompting us to begin measurement work.
6. Why evidence of impact?
7. (image-only slide)
8. (image-only slide)
9. What the community said
   Download the Survey
10. Technologies to Measure
    Web conferencing
    LMS and individual features
    Lecture capture
    Mobile learning tools (laptops, ebooks, tablets)
    Clickers
    Collaborative tools
    Student-generated content
    Web 2.0 and social networking technologies
    Learning spaces
    OER
    Personal learning environments
    Online learning: HyFlex course design, blended learning programs, synchronous/asynchronous delivery modes, fully online programs
    Eportfolios
    Multimedia projects and tools: podcasts/vodcasts
    Simulations
    Early alert systems
    Cross-curricular information literacy programs
    Large course redesigns
11. Technologies and their connection/relationship to…
    Student engagement
    Learning-related interactions
    Shrinking the large class
    Improving student-to-faculty interaction
    Student retention and success
    Specific learning outcomes
12. The 3 most important indicators you use to measure the evidence of impact of technology-based innovations in T&L
13. What is "evidence"?
    Grades (frequently mentioned)
    Learning outcomes (frequently mentioned)
    Satisfaction
    Skills
    Improved course evaluations
    Measures of engagement and participation
    Retention/enrollment rates
    Graduation rate
    Direct measures of student performance (at the course level and cumulative)
    Interview data
    Institutional data
    Faculty/student technology utilization rates
    Data on student/faculty facility and satisfaction with using technology
    Successfully implementing technology
    Job placement
    Student artifacts
    Better faculty reviews by students
    Course redesign to integrate changes; impact on the ability to implement best pedagogical practice
    Rates of admission to graduate schools
    Success in more advanced courses
14. Methods/techniques you routinely use for gathering evidence of impact of technology-based innovations in T&L
15. The most difficult tasks associated with measurement were ranked as follows:
    1. Knowing where to begin to measure the impact of technology-based innovations in T&L
    2. Knowing which measurement and evaluation techniques are most appropriate
    3. Conducting the process of gathering evidence
    4. Knowing the most effective way to analyze our evidence
    5. Communicating the results of our analysis to stakeholders
16. I have worked with evaluative data
    Yes
    No
17. I have worked with evaluative data at the…
    Course level (in my own course)
    Course level (across several course sections)
    Program level (math, English)
    Degree level
    Across an institution or several programs
    Other
18. Using evaluation tools effectively
19. Remember: technologies and their connection/relationship to…
    Student engagement
    Learning-related interactions
    Shrinking the large class
    Improving student-to-faculty interaction
    Student retention and success
    Specific learning outcomes
20. Triangulate to tell the full story. The impact of a curricular innovation should be "visible" from a variety of perspectives and measurement techniques.
    The three most commonly used evaluation tools are:
    questionnaires (paper or online),
    interviews (individual or focus group), and
    observations (classroom or online).
21. 5 Steps
    1. Establish the goals of the evaluation: What do you want to learn?
    2. Determine your sample: Whom will you ask?
    3. Choose a methodology: How will you ask?
    4. Create your instrument: What will you ask?
    5. Pre-test the instrument: Are you getting what you need? (Pilot your tools/strategies.)
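To make the pilot step concrete, here is a minimal sketch, with hypothetical data, of tallying responses to a single Likert-scale item from a pilot run of a questionnaire. Nothing here comes from the talk itself; the item and response values are invented for illustration. A quick tally like this shows whether the instrument is returning usable data before full deployment.

```python
from collections import Counter

# Hypothetical pilot responses to one Likert item
# (1 = strongly disagree ... 5 = strongly agree)
responses = [4, 5, 3, 4, 5, 2, 4, 5, 5, 3]

tally = Counter(responses)          # distribution of ratings
n = len(responses)
mean = sum(responses) / n           # simple central tendency

print("Response distribution:", dict(sorted(tally.items())))
print(f"Mean rating: {mean:.1f} (n={n})")
# A pilot pass like this surfaces problems early: items with no variance
# (everyone answers identically) or items many respondents skip.
```

A spreadsheet would do the same job; the point is that pre-testing gives you numbers to inspect before the instrument goes out for real.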
22. What is a good question?
    Significance: it addresses a question or issue that the community sees as important and relevant
    Specificity: the question focuses on specific objectives
    Answerability: the question can be answered by data collection and analysis
    Connectedness: it is linked to relevant research/theory
    Coherency: it provides coherent explanations that rule out counter-interpretations
    Objectivity: the question is free of bias
    Whom does your evidence need to persuade?
23. Quantitative: this approach starts with a hypothesis (or theory or strong idea) and seeks to confirm it.
    Qualitative: these studies start with data and look to discover the strong idea or hypothesis through data analysis.
    Mixed: this approach combines the two, pairing the confirmation of a hypothesis with exploratory data analysis, and provides multiple perspectives on complex topics.
    Example: start with a qualitative study to gather data and identify a hypothesis, then follow with a quantitative study to confirm it.
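As a minimal illustration of the quantitative approach, the sketch below compares exam scores from two course sections and reports group means plus a standardized effect size (Cohen's d). The scores, group names, and the choice of effect-size measure are all assumptions for illustration, not data from the talk; only the Python standard library is used.

```python
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference between two groups, using the pooled SD."""
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    n_a, n_b = len(group_a), len(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Hypothetical exam scores: redesigned vs. traditional course section
redesigned = [78, 85, 90, 72, 88, 95, 81, 84]
traditional = [70, 75, 82, 68, 79, 85, 73, 77]

print(f"Redesigned mean:  {statistics.mean(redesigned):.1f}")
print(f"Traditional mean: {statistics.mean(traditional):.1f}")
print(f"Effect size (Cohen's d): {cohens_d(redesigned, traditional):.2f}")
```

In a real study you would add a significance test and a much larger sample; the sketch only shows the shape of a confirmation-style comparison.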
24. The Double Loop
25. Methods? Support in data collection? Double loop?
26. Using evidence to influence teaching practices
27. "Higher education institutions seem to have a good understanding of the assessment process through the use of rubrics, e-portfolios, and other mechanisms, but the difficulty seems to be in improving the yield of the assessment processes, which is more of a political or institutional culture issue."
28. Why We Measure
    Inward (course level, inform teaching, evaluate technology use, reflective)
    Outward:
    Share results with students
    Share results with potential students
    Share results with other faculty (in/out of discipline)
    Share results at the institutional or departmental level (info literacy, writing, cross-course projects)
    Results can be a strategic advantage
29. Lessons from the Wabash National Study
    A 3-year research and assessment project
    Provides participating institutions extensive evidence about the teaching practices, student experiences, and institutional conditions that promote student growth across multiple outcomes
    Inputs: the attitudes and values that students bring into college
    Experiences: the experiences that affect students once they are in college
    Outcomes: the impact that college has on student ability and knowledge
30. Measuring student learning and experience is the easiest step in the assessment process. The real challenge begins once faculty, staff, administrators, and students at institutions try to use the evidence to improve student learning.
31. Lessons from the Wabash National Study
    Faulty assumptions about using evidence to improve:
    A lack of high-quality data is the primary obstacle to using assessment evidence to promote improvements
    Providing detailed reports of findings is the key mechanism for kicking off a sequence of events culminating in evidence-based improvements
    The intellectual approach that faculty and staff use in their scholarship facilitates assessment projects
34. Lessons from the Wabash National Study
    Perform audits of your institution's information about student learning and experience
    Set aside resources for faculty, student, and staff responses before assessment evidence is shared
    Develop communication plans to engage a range of campus representatives in data discussions
    Use conversations to identify 1-2 outcomes on which to focus improvement efforts
    Engage students in helping to make sense of and form responses to assessment evidence
35. Download the rubrics:
36. Research Grants: 2010 & 2011
37. (image-only slide)
38. Organizational-Level Data
    Learner satisfaction:
    Student satisfaction was higher in QM-reviewed and non-reviewed courses than in courses at non-QM institutions. (Aman dissertation, Oregon State, 2009)
    Course evaluation data showed student satisfaction increased in redesigned courses. (Prince George's Community College, MD, 2005)
    Currently conducting a mixed-methods study of student and faculty perceptions of QM-reviewed courses. (University of the Rockies)
    Student learning:
    Grades improved with improvements in learner-content interaction (a result of review). (Community College of Southern Maryland, 2005)
    Differences approaching significance on outcome measures. (Swan, Matthews, Bogle, Boles, & Day, University of Illinois/Springfield, 2010+)
    QM Rubric implementation had a positive effect on students' higher-order cognitive presence and discussion forum grades via higher teaching presence. (Hall, Delgado Community College, LA, 2010)
39. Organizational-Level Data
    Teacher learning:
    Use of QM design standards led to "development of a quality product, as defined by faculty, course designers, administrators, and students, primarily through faculty professional development and exposure to instructional design principles" (p. 214). (Greenberg dissertation, Ohio State, 2010)
    Currently utilizing the TPACK framework to explain the process by which new online teachers use the QM rubric and process when designing an online course. (University of Akron)
    Organizational learning:
    There may be a carryover effect to non-reviewed courses when an institution commits to the QM standards. (Aman dissertation, Oregon State, 2009)
    Faculty/design teams respond differently when QM is presented as a rule rather than a guideline. (Greenberg dissertation, Ohio State, 2010)
    Extended positive impact on faculty developers and on members of review teams. (Preliminary analysis 2009; comprehensive summer 2011)
40. Alignment in the curriculum between course objectives, goals, and assessments
    Faculty members identify which assignments they have aligned with learning objectives
    Design rubrics or instructions to prompt them at various data-collection points
    Departments or colleges are asked to report data online at the end of each term, with prompts for comparison and reflection
    Doing so makes the data ready for larger-scale assessment efforts
    Session recording and resources:
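The end-of-term reporting workflow described above can be sketched as a small roll-up step: collecting per-course alignment reports and aggregating them by department for institution-level comparison. The data structure, field names, and courses below are hypothetical, invented purely to illustrate the shape of such an aggregation.

```python
# Hypothetical end-of-term reports: each records how many of a course's
# learning objectives were aligned with an assessed assignment.
reports = [
    {"dept": "Math", "course": "MATH101", "objectives_aligned": 4, "objectives_total": 5},
    {"dept": "Math", "course": "MATH201", "objectives_aligned": 3, "objectives_total": 5},
    {"dept": "English", "course": "ENGL101", "objectives_aligned": 5, "objectives_total": 5},
]

def alignment_by_dept(reports):
    """Roll per-course reports up into a department-level alignment rate."""
    totals = {}
    for r in reports:
        aligned, total = totals.get(r["dept"], (0, 0))
        totals[r["dept"]] = (aligned + r["objectives_aligned"],
                             total + r["objectives_total"])
    return {dept: aligned / total for dept, (aligned, total) in totals.items()}

for dept, rate in alignment_by_dept(reports).items():
    print(f"{dept}: {rate:.0%} of course objectives aligned with assessments")
```

Once reports are captured in a uniform shape like this, the same data can feed larger-scale assessment efforts without re-collection.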
41. What organizational mechanisms do you have in place to measure outcomes?
42. Reporting results
43. Match your research design to the type of information in which your anticipated consumers are interested or to which they will best respond.
    Match your data-collection method to the type of data to which your information consumer will respond or is likely to respect.
44. Keep it simple, to the point, and brief. Know who is consuming your data or research report, who the decision makers are, and how your data is being used to make which decisions, if any.
    Although time-consuming, it may be worthwhile to tailor your reports or analysis to the audience, emphasizing certain findings or providing deeper analysis of sections of particular interest.
45. Good research: tips and tricks
46. Be careful of collecting too much data
    Be aware of reaching the point at which you are no longer learning anything from the data
    Write up and analyze your data as soon as possible
    Record interviews and focus groups, and capture your own observations or impressions immediately after each interaction
47. Besides all the usual good reasons for not reinventing the wheel, using others' tested surveys, tools, or methods gives you a point of comparison for your own data
    When collecting data, talk to the right people
    Don't overschedule: space out interviews, focus sessions, observations, and other tactics so that you can get the most from your interactions
48. Guiding Questions / Next Steps
    Who are the key stakeholders for the innovative teaching and learning projects in which I am involved?
    How can I help faculty members communicate the results of their instructional innovations to a) students, b) administrators, and c) their professional communities?
    What "evidence" indicators do my key stakeholders value most (e.g., grades, satisfaction, retention)?
    Which research professionals or institutional research units can assist me in my data collection, analysis, and reporting efforts?
49. Collecting Cases
    Project overview:
    Project goals, context, and design
    Data collection methods
    Data analysis methods
    Findings
    Communication of results
    Influence on campus practices
    Reflection on design, methodology, and effectiveness:
    Project setup and design
    Project data collection and analysis
    Effectiveness and influence on campus practices
    Project contacts
    Supporting materials
50. Online Spring Focus Session, April 2011
    Read about the initiative:
    Get involved:
51. Join the ELI Evidence of Impact Constituent Group
52. SEI Focus Session Content
    These items for the 2011 Online Spring Focus Session on seeking evidence of impact can be found at
    ELI Seeking Evidence of Impact Resource List (websites, reports, articles, and research):
    ELI Seeking Evidence of Impact Discussion Questions:
    ELI Seeking Evidence of Impact Activity Workbook, Days 1 and 2:
    ELI Seeking Evidence of Impact Reflection Worksheet:
    Presentation slides and resources for all sessions can be found at
53. Other Related Resources
    Focus Session Learning Commons:
    Full focus session online program:
    ELI Seeking Evidence of Impact initiative site:
    Resource site:
    Suggest an additional resource:
    Get involved:
    Contribute:
54. Contact Information
    Veronica M. Diaz, PhD
    Associate Director, EDUCAUSE Learning Initiative

    Copyright Veronica Diaz, 2011. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.