
Department IE Assessment Process


This presentation focuses on university and college effectiveness (IE): “The effort to make certain the institution makes progress on its goals, accreditation and strategic plan.”

Published in: Education, Technology, Business


  1. An Overview of the Institutional Effectiveness Process
     Kelly McMichael, Ed.D.
  2. Institutional Effectiveness
     “To establish and sustain a data-driven and documented system of evaluation and assessment that will ensure the University and its branches perform in a manner that is always improving its educational and operational quality.”
  3. Institutional Effectiveness
     “The effort to make certain the institution makes progress on its goals, accreditation and strategic plan.”
  4. Institutional Effectiveness
  5. Institutional Effectiveness
     “Anything worth doing is worth measuring.”
     “You get what you measure.”
     “You expect what you inspect.”
  6. IE Assessment Process
     1. Identification (outcomes and standards are collectively developed and criteria quantified): “What’s important to measure?”
     2. Instrumentation (rubrics, tests, records, surveys, and tracking programs developed with reliability and validity): “Do we have valid tools to use in measuring?”
     3. Implementation (consistently and comprehensively performed): “How and when will we measure?”
     4. Analysis (data is computed, summarized, and easily understood): “What does the data mean?”
     5. Dissemination (results are sent to key repositories): “Who needs the data and results?”
     6. Utilization (necessary changes are suggested and made): “What do we do with the results? What needs to change?”
     7. Documentation (minutes and records are kept through meetings and archived using the IE Report template for future reference): “How and where do we keep data and changes?”
  7. Instruments in Use
     We need to ensure reliability (consistency of measurements) and validity (capturing sound intended data through questions that support outcomes and goals).
     • CAPS course database
     • Portfolio Review rubrics
     • IDEA course surveys
     • Student Satisfaction Inventory
     • Annual Employee Survey
     • Employee training evaluations
     • Accuplacer Test
     • WritePlacer Test
     • ETS Proficiency Profile
     • Alumni surveys
     • Capstone course rubrics
     • Internship surveys
     • Committee minutes
     • Help Desk Log
     • College Success (FS101) surveys
  8. Portfolio Reviews
     One particularly valuable and direct measure of student learning occurs when academic departments assess student learning outcomes at the 3rd, 6th, 9th, and 12th quarters of a student’s progress in the program. At these milestones, key student learning outcomes are evaluated using specially developed programmatic rubrics.
     The rubrics are applied by a panel of faculty evaluators, and the University uses their findings to help students improve in presentation and content mastery.
     Repeated substandard performance across multiple reviews and multiple students may indicate that corrections are needed in program curriculum and instructional methodology.
  9. Examples of Data: Administration
  10. Examples of Data: Academic
  11. The Nichols Template
  12. The Nichols Template
  13. Oversight Groups
     The University has established three domain-specific oversight groups (appointed by the University’s President) to review the Institutional Effectiveness Reports. These reviews occur annually, before the kick-off of the Strategic Planning and Budgeting processes. The oversight groups are charged with three key tasks:
     (1) evaluate whether the institution successfully achieved its desired outcomes from the previous institutional effectiveness and planning cycle;
     (2) identify key areas requiring improvement that analysis of the institutional effectiveness reports has surfaced; and
     (3) develop strategies and recommendations for quality-improvement initiatives for the next strategic planning and budgeting cycle.
  14. Oversight Groups
     • The Academic Oversight Group reviews key direct and indirect measures of educational outcomes.
     • The Educational Support Services Oversight Group reviews assessment data and other key information concerning the effectiveness of student services.
     • The Administrative Services Oversight Group reviews assessment data and other key information concerning the effectiveness of administrative services and of the University’s use of technology.
  15. Strategic Planning