Rooted in Research: Establishing Coherent Partnerships between Institutional Research and the Quality Enhancement Plan
 

Presentation Transcript

    • Establishing Coherent Partnerships between Institutional Research and the Quality Enhancement Plan
      Dr. Ghazala Hashmi, Coordinator of the Quality Enhancement Plan
      Dr. Jackie Bourque, Director, Office of Institutional Effectiveness
      J. Sargeant Reynolds Community College, Richmond, Virginia
      SACS Annual Conference, Orlando, Florida, December 2011
    • The Session’s Goals
      • Present J. Sargeant Reynolds Community College’s use of institutional research to 1) identify, develop, and implement its QEP, and 2) bring a variety of college units into effective partnerships.
      • Share the challenges and the successes the College has encountered in gathering data and in making effective use of data to improve student learning in distance education.
    • Outcomes of this Session
      By the end of this session we hope you will be able to
      • Identify best practices in effectively gathering and using institutional data points for the selection, development, and implementation of a QEP topic.
      • Develop effective, collaborative partnerships between the QEP Team and the Office of Institutional Research.
      • Share with your colleagues templates for gathering the grounding institutional data that helps to guide QEP selection teams toward an effective QEP topic, to develop the substance of the QEP, and to assess the effectiveness of the QEP during implementation.
    • The Ripple Effect: Transforming Student Success in Distance Learning, One Student & One Instructor at a Time
      The Quality Enhancement Plan at J. Sargeant Reynolds Community College
      [Diagram: The Ripple Effect in Distance Learning — Student Readiness, Student Orientation & Student Support, Faculty Training, Student Success]
    • QEP Topic Selection: Digging into Data
    • Data Profile Template
    • Growth in Number of Distance Learning Enrollments
      Term     1999-2000  2000-01  2001-02  2002-03  2003-04  2004-05  2005-06  2006-07  2007-08
      Summer        751      856      984     1079     1282     1529     1979     2132     2437
      Fall         1049     1140     1223     1584     1814     2175     2524     2817     3414
      Spring       1085     1189     1463     1680     2086     2462     2710     2802     3393
      TOTAL        2885     3185     3670     4343     5182     6166     7203     7768     9244
    • Growth in Number of Distance Learning Sections
      Term     1999-2000  2000-01  2001-02  2002-03  2003-04  2004-05  2005-06  2006-07  2007-08
      Summer         65       75       83       80       97      109      136      142      144
      Fall           83      108      111      115      138      148      167      193      176
      Spring         95      103      119      121      139      166      172      198      185
      TOTAL         243      286      313      316      374      423      475      502      534
    • Increasing Numbers of Faculty Teaching Distance Learning Sections
      [Chart: full-time and part-time faculty counts for 2003-2004, 2006-2007, 2008-2009, and 2009-2010]
    • Student Success Rates in DL Classes (Original Template)
    • Student Success Rates in Face-to-Face Classes (Original Template)
    • Student Persistence Rates in Face-to-Face versus Distance Learning Classes: An Example of the New Data Template
      (Counts shown as Persisted/Withdrew N (%) for On-Campus, Face-to-Face versus Distance Learning sections; * flags a difference significant at p < .10, with the disadvantaged modality noted.)

      Course       F2F Persisted   F2F Withdrew   DL Persisted   DL Withdrew   p<.10   Disadvantage
      ACC - Accounting
        ACC 115    86 (88.7%)      11 (11.3%)     57 (76.0%)     18 (24.0%)      *     Online
        ACC 211    382 (85.3%)     66 (14.7%)     147 (90.2%)    16 (9.8%)
        ACC 212    272 (95.4%)     13 (4.6%)      131 (94.9%)    7 (5.1%)
      ADJ - Administration of Justice
        ADJ 100    33 (100.0%)     0 (0.0%)       33 (97.1%)     1 (2.9%)
        ADJ 105    31 (86.1%)      5 (13.9%)      60 (89.6%)     7 (10.4%)
        ADJ 201    33 (100.0%)     0 (0.0%)       29 (93.5%)     2 (6.5%)
      ART - Art
        ART 100    57 (89.1%)      7 (10.9%)      330 (88.7%)    42 (11.3%)
        ART 101    37 (92.5%)      3 (7.5%)       29 (70.7%)     12 (29.3%)      *     Online
        ART 102    17 (70.8%)      7 (29.2%)      55 (93.2%)     4 (6.8%)        *     Regular
        ART 106    17 (94.4%)      1 (5.6%)       41 (97.6%)     1 (2.4%)
      ASL - American Sign Language
        ASL 201    38 (90.5%)      4 (9.5%)       8 (100.0%)     0 (0.0%)
      BIO - Biology
        BIO 101    1051 (91.9%)    93 (8.1%)      98 (91.6%)     9 (8.4%)
        BIO 106    113 (91.1%)     11 (8.9%)      53 (82.8%)     11 (17.2%)      *     Online
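The slide does not say which test produced the p < .10 flags in this template. As one possible reading, the sketch below runs a chi-square test on the two-by-two persisted/withdrew counts for a single course (ACC 115 from the table above) and reports which modality has the lower persistence rate; the test choice and the helper name flag_persistence_gap are assumptions, not the College's documented method.

```python
# A minimal sketch (assumed method, not the College's documented procedure):
# test whether face-to-face and distance-learning persistence rates differ
# for one course, using a chi-square test on the 2x2 persisted/withdrew counts.
from scipy.stats import chi2_contingency

def flag_persistence_gap(f2f_persisted, f2f_withdrew,
                         dl_persisted, dl_withdrew, alpha=0.10):
    """Return (p_value, disadvantaged_modality_or_None) for one course."""
    table = [[f2f_persisted, f2f_withdrew],
             [dl_persisted, dl_withdrew]]
    p_value = chi2_contingency(table)[1]   # second element is the p-value
    if p_value >= alpha:
        return p_value, None               # no flag in the template
    f2f_rate = f2f_persisted / (f2f_persisted + f2f_withdrew)
    dl_rate = dl_persisted / (dl_persisted + dl_withdrew)
    return p_value, "Online" if dl_rate < f2f_rate else "Regular"

# ACC 115 from the table above: 86/11 face-to-face vs. 57/18 online.
p, flag = flag_persistence_gap(86, 11, 57, 18)
print(f"p = {p:.3f}, disadvantage: {flag}")   # flags the online sections
```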
    • QEP Development: Defining a Data-driven Plan
    • Growing the Plan: External Research
      • Best practices: Identifying national standards relevant to our QEP topic
      • Other institutional efforts: Identifying other institutional efforts, particularly at compatible colleges, similar to our own
      • A review of the literature: Developing annotated bibliographies in order to present an academic review of the material
    • Growing the Plan: Internal Research
      Evaluating topical data:
      • Distance Learning Student Survey and Report – what do our students think about course design; who are our online students; what are the barriers to their success?
      • Discipline Review of Online Courses, demonstrating broad gaps between face-to-face and online student success rates
      Consulting faculty:
      • Distance Learning Faculty Focus Group – what are their perceptions; what do they consider to be the barriers to student success?
      • Faculty training needs – evaluating needs in technology training and instruction in pedagogy, course design, and assessment of student learning
      Evaluating general college data:
      • Continued evaluation of general college data: enrollments, success rates, persistence rates
    • An Example of Data Collected and Evaluated: FTES Comparison - Fall 2009 with Fall 2008, by campus
      Campus          Fall 2008 (10/20/08)   Fall 2009 (10/19/09)   FTES Change   % Change
      Campus One               3,060.93               3,375.27         314.33       10.27
      Campus Two               1,692.20               1,867.67         175.47       10.37
      Campus Three               228.93                 250.87          21.93        9.58
      Off-Campus                  12.00                   9.60          -2.40      -20.00
      Off-Campus                 830.60                 482.27        -348.33      -41.94
      Virtual                    792.00               1,056.20         264.20       33.36
      Unknown                     12.40                   0.00         -12.40     -100.00
      Total                    6,629.07               7,041.87         412.80        6.23
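The FTES Change and % Change columns are year-over-year arithmetic on the two fall snapshots. Below is a minimal sketch of that calculation, assuming a pandas data frame with made-up column names; small rounding differences from the slide are possible if the slide starts from unrounded data.

```python
# Illustrative only: recompute the FTES change columns from the two fall snapshots.
import pandas as pd

ftes = pd.DataFrame(
    {"fall_2008": [3060.93, 1692.20, 228.93, 792.00],
     "fall_2009": [3375.27, 1867.67, 250.87, 1056.20]},
    index=["Campus One", "Campus Two", "Campus Three", "Virtual"],
)
ftes["change"] = (ftes["fall_2009"] - ftes["fall_2008"]).round(2)
ftes["pct_change"] = (ftes["change"] / ftes["fall_2008"] * 100).round(2)
print(ftes)   # e.g. Virtual: +264.20 FTES, +33.36 %
```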
    • QEP Implementation: Driving, Detouring, and Documenting
    • The Driver: the QEP Assessment Plan
      Using the Emerging Data to Drive the Plan
      Student Readiness
      1. A student profile emerges through SmarterMeasure.
      2. We evaluate the relationships between this profile and student success.
      3. We construct our distance learning orientation around institutional data.
      Student Orientation
      1. Students and faculty provide qualitative feedback.
      2. We evaluate the success of the orientation by measuring its impact on students.
      3. We modify the orientation based on data.
    • The Driver: the QEP Assessment Plan (continued)
      Faculty Training
      1. Faculty provide self-assessments of their own skills and understanding of course design and online teaching.
      2. Ongoing peer-to-peer reviews provide qualitative and quantitative data.
      3. We evaluate student success and student persistence rates of trained and untrained faculty.
      4. Faculty provide feedback about the impact of the training.
      5. We assess our own training services through the modules that have been designed and delivered.
      6. The QEP Team makes modifications based on the results of the data.
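Item 3 above compares success and persistence rates for students of trained versus untrained faculty. A minimal sketch of that comparison follows, assuming a per-enrollment record with invented completed_training and succeeded fields (not the College's actual data layout).

```python
# Illustrative sketch: compare success rates for sections taught by
# QEP-trained vs. untrained faculty. Field names are assumptions.
import pandas as pd

# One row per enrollment: did the instructor complete QEP training,
# and did the student succeed (e.g., earn a C or better)?
enrollments = pd.DataFrame({
    "completed_training": [True, True, True, False, False, False, False, True],
    "succeeded":          [True, True, False, True, False, False, True, True],
})

rates = (enrollments
         .groupby("completed_training")["succeeded"]
         .agg(students="count", success_rate="mean"))
rates["success_rate"] = (rates["success_rate"] * 100).round(1)
print(rates)   # success rate (%) for untrained (False) vs. trained (True) faculty
```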
    • Partnerships in Institutional Research
      [Organizational chart: Executive Vice President; QEP Coordinator; QEP Assistant Coordinator; Office of Institutional Effectiveness (Research Analyst); Center for Distance Learning; Faculty Member with online experience; Office of Student Affairs; Office of Academic Affairs; Office of Technology Training; Professional Development]
    • The Detours
      Assessment leads to new sprouts. Leaving room for growth and expansion into new territories has been both challenging and rewarding.
      • Peer Academic Leaders (PALs) Program
      • Rewards and Recognitions Program
      • New Faculty Certification for Distance Learning Policy
      • New Communications Tools
      • The New Faculty Development Database
    • Peer Academic Leaders (PALs) Program
      Although PALs developed from the QEP and from a limited, on-campus program, it presented new challenges:
      • Funding
      • Recruitment and Training of Peer Leaders
      • Administrative Oversight
      • New Marketing
      • New Assessment Tools and Activities
    • New Communications Tools: The QEP Blog
    • A New Faculty Development Database that also integrates HRMS/SIS and the Knowledge Center
      Fall 2010 Statistics
      • Current distance learning faculty: 153
      • Completed either Tier One or Tier Two training (TOP or IDOL): 48
      • Not yet completed either Tier One or Tier Two training: 105
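One reading of this slide is that the Fall 2010 counts come from joining the HRMS/SIS roster of current distance learning faculty against Knowledge Center training-completion records. The sketch below shows that kind of join in outline; the table and column names (dl_faculty, training_records, emp_id, course) are assumptions for illustration, not the actual database schema.

```python
# Illustrative join: count current DL faculty with and without Tier One/Tier Two
# (TOP or IDOL) training. Table and column names are assumed, not the real schema.
import pandas as pd

# Current distance learning faculty from the HRMS/SIS extract (assumed layout).
dl_faculty = pd.DataFrame({"emp_id": [101, 102, 103, 104]})

# Completed training records from the Knowledge Center (assumed layout).
training_records = pd.DataFrame({
    "emp_id": [101, 103, 999],           # 999 is not a current DL instructor
    "course": ["TOP", "IDOL", "TOP"],
})

tiers = training_records[training_records["course"].isin(["TOP", "IDOL"])]
merged = dl_faculty.merge(tiers[["emp_id"]].drop_duplicates(),
                          on="emp_id", how="left", indicator=True)

completed = (merged["_merge"] == "both").sum()
print(f"current DL faculty: {len(dl_faculty)}, "
      f"completed: {completed}, not completed: {len(dl_faculty) - completed}")
```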
    • The Documentation: Using Data to Support QEP Initiatives
      • Reporting to College Executives: WEAVE Online
      • Reporting to the Broader QEP Team: SharePoint
      • Annual Reports to a general college audience
      • Regular summaries and reports of ongoing assessment efforts and results on the public blog
      • Ongoing presentations to internal audiences
    • Digging, Driving, Documenting
      In summary, we have found that the ongoing effectiveness of, and enthusiasm for, the QEP are built upon three primary factors, and they all relate to the research of the QEP:
      • Digging into the data (gathering, evaluating, discussing)
      • Driving with data as the guide (building, daydreaming, detouring)
      • Documenting the data (communicating and sharing)
    • For more information
      Jackie Bourque
      Director, Office of Institutional Effectiveness
      jbourque@reynolds.edu
      Ghazala Hashmi
      Coordinator, Quality Enhancement Plan
      ghashmi@reynolds.edu
      www.reynolds.edu/qep