Evaluating and Developing Faculty in Online Self-Paced Courses
June 28, 2017
23rd Annual CAHEA Conference - Omaha, NE
Lydia Gillespie and Andrew Beaty
Moody Bible Institute | Distance Learning
The Presenters
Lydia Gillespie, MPS
Project Manager
Moody Distance Learning
Instructional Quality Team
Andrew Beaty, Ed.D. (ABD), M.A.C.E.
Associate Director of Faculty Development
and Assessment
Moody Distance Learning
Instructional Quality Team
Agenda
1. Outline a model of faculty evaluation and
development for online, self-paced courses
2. Demonstrate the impact of the faculty evaluation and
development model
3. Provide a summary of lessons learned
Overview
Understanding the Problem
Student Experience
● Low Student
Evaluation Scores
● High Student
Attrition Rates
● Suffering from
disengagement
Faculty Behaviors
Lack of engagement in
the course
Glorified graders
No regular contact
with students
Slow return of graded work
Administrative Perspective
Money maker vs.
educational
experience
Low value for venue
Lack of understanding
of pedagogical
needs
Objectives:
Shift perspectives
Increase student engagement
Increase instructional engagement
POP QUIZ
How would you fix
this situation?
Laying the Foundation
Back to the Basics
What Research Taught Us
Interaction Equivalency Theory
Developed by Dr. Terry Anderson (2003)
Three building blocks for a strong learning
experience:
Student to student interaction
Student to content interaction
Student to teacher interaction
[Diagram: Formative Learning Experience at the intersection of Student-Content Interaction, Student-Teacher Interaction, and Student-Student Interaction]
Back to the Basics
What We Knew
A strong Instructional Presence can:
Increase student engagement
Increase student satisfaction rates
Decrease student attrition rates
Foster a productive, engaged learning environment
Work to accomplish Moody’s mission
Supported by the Community of Inquiry (CoI) model (Garrison, Anderson, & Archer, 2000)
Back to the Basics
What We Knew
A strong Instructional Presence is demonstrated by consistent instructor visibility.
Visibility can look like:
Frequent instructor activity within the course
Regular instructor communication
Grading feedback
(Lehman and Conceicao, 2010)
Measuring Performance
OLSP Faculty Expectations
Active in Course Twice a Week
Post Weekly Video Announcements
Return Graded Assignments within Seven Days
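As a rough sketch, these three expectations lend themselves to an automated check against LMS activity exports. The field names below (login_days, video_announcement, max_grading_days) are illustrative assumptions, not Blackboard's actual report schema.

```python
# Hypothetical compliance check for the three OLSP faculty expectations.
# Field names are illustrative; real data would come from LMS activity reports.
from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    login_days: int           # distinct days active in the course this week
    video_announcement: bool  # was the weekly video announcement posted?
    max_grading_days: int     # oldest grading turnaround this week, in days

def meets_expectations(week: WeeklyActivity) -> dict:
    """Flag each of the three OLSP expectations as met or missed."""
    return {
        "active_twice_weekly": week.login_days >= 2,
        "weekly_video_announcement": week.video_announcement,
        "graded_within_seven_days": week.max_grading_days <= 7,
    }

# An instructor active one day who posted an announcement and returned all
# grades within five days misses only the activity expectation.
print(meets_expectations(WeeklyActivity(1, True, 5)))
```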
The Plan
Evaluating and Developing Performance
OLSP IQ Standards
Communication Flow
Regular emails to OLSP Instructors
Built off the Course Administration Checklist
Intended to remind and encourage instructors toward
expected teaching behaviors
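As one hedged illustration of the flow's cadence: the presenters' notes mention these emails went out at regular 2-week intervals, so a schedule for a 16-week term might be generated as below. The start date is an assumption, and the email topics would come from the Course Administration Checklist.

```python
# Hypothetical sketch: a 2-week-interval email schedule for a 16-week term.
# The start date is illustrative only.
from datetime import date, timedelta

term_start = date(2017, 1, 9)  # assumed term start
send_dates = [term_start + timedelta(weeks=2 * i) for i in range(8)]
for n, send_on in enumerate(send_dates, 1):
    print(f"Reminder email {n}: {send_on.isoformat()}")
```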
OLSP IQ Monitoring Program
Instructional Quality evaluation
Classroom observation 5x throughout the term
Instructional Quality assessed with a rubric
Instructional Quality (IQ) Score calculated for the term
OLSP IQ Accountability Program
Faculty development/coaching program
Coaching email sent 5 times throughout the term
Based on data from the OLSP IQ Monitoring Program
Data used to inform instructional course assignments
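A minimal sketch of how the five rubric observations could roll up into a term IQ score, assuming the 1-3 scales (1-4 for announcements) described in the presenters' notes and a simple points-earned over points-possible aggregation; the program's actual weighting is not specified here.

```python
# Hypothetical term-level roll-up of weekly Instructional Quality rubric scores.
# Scales follow the rubric as described (login 1-3, announcements 1-4,
# grading 1-3); the equal-weighted aggregation is an assumption.
WEEK_MAX = 3 + 4 + 3  # top score in each rubric category

def term_iq_score(weeks):
    """weeks: list of (login, announcement, grading) rubric scores."""
    earned = sum(sum(week) for week in weeks)
    return 100 * earned / (WEEK_MAX * len(weeks))

# Five observations across the term for one instructor:
observations = [(3, 4, 3), (2, 4, 3), (3, 3, 3), (3, 4, 2), (3, 4, 3)]
print(f"Term IQ score: {term_iq_score(observations):.2f}%")  # 94.00%
```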
OLSP Instructional Quality Rubric
OLSP Instructional Quality Monitoring Spreadsheet
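A worked sketch of a single spreadsheet row, using the Week 2 example from the presenters' notes (login scored 2, announcement scored 4, and grading assumed at the top score of 3 since no graded items were older than 7 days); the points-over-possible roll-up remains an assumption.

```python
# Hypothetical computation for one week's row of the monitoring spreadsheet.
login, announcement, grading = 2, 4, 3  # grading score of 3 is an assumption
week_max = 3 + 4 + 3                    # top score in each rubric category
week_score = 100 * (login + announcement + grading) / week_max
print(f"Week 2 score: {week_score:.0f}%")  # Week 2 score: 90%
# The login score of 2 flags the shortfall a coaching email would address:
# active only 1 day that week instead of the required 2.
```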
The Outcome
Measuring the Plan
Instructional Quality Scores
Faculty instructional quality has
improved as the program has
continued.
Highlights:
Fall 2015 to Spring 2016 IQ score growth: 6.5% (84.00% to 89.49%)
In subsequent terms, IQ scores held steady at an average of 91%
Average Instructional Quality Score, Fall 2015 - Spring 2017:
Fall 2015: 84.00% | Spring 2016: 89.49% | Summer 2016: 92.57% | Fall 2016: 90.05% | Spring 2017: 91.10%
Measuring the Plan
Student Evaluation Scores
Students are more satisfied with the instructional quality.
Highlights:
12% increase in Student
Evaluation Scores
8% above minimum
expectation
Measuring the Plan
Course Completion Rates
More students are successfully
completing OLSP courses.
Highlights:
19-percentage-point increase in Course Completion Rate
3 points over the minimum expectation
Historical Average (Fall 2010 - Summer 2015): 69%
Current Average (Fall 2015 - Summer 2016): 88%
Minimum Expectation: 85%
19-point increase
Best Practices Learned:
You CAN change the ethos of a program
Faculty engagement is CRITICAL to successful teaching
Even in a self-paced environment, students respond to
faculty engagement
Training and valuing faculty is vital
Future Plans
Launching new training course for self-paced instructors
Similar program started in 8-Week courses (beta phase)
Working on “big data” to analyze other trends
Questions?
Contact Us
Lydia Gillespie | Project Manager
Lydia.Gillespie@moody.edu
Andrew Beaty | Associate Director of
Faculty Development and Assessment
Andrew.Beaty@moody.edu
Find this presentation at: http://bit.ly/MoodyCAHEA2017
Bibliography
Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for
interaction. The International Review of Research in Open and Distance Learning (IRRODL),
4(2).
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based
environment: Computer conferencing in higher education. The Internet and Higher
Education, 2(2-3), 87-105.
Lehman, R. M., & Conceicao, S. C. O. (2010). Creating a sense of presence in
online teaching. San Francisco: Jossey-Bass.


Editor's Notes

  • #3 Andrew and Lydia
  • #5 Andrew: Go way back to 1901. We were the predecessor to MOOCs... Massive Open Online Courses (joke)
  • #6 Andrew: Struggle with faculty behaviors. Struggle with the mindset of this being a money maker vs. an educational experience for students. I had told Janet, our boss, for a couple of years that I KNEW we were going to inherit it at some point and it was going to be UGLY... it WAS! Our team inherited it in the middle of this time and then we transitioned it to two 16-week and one 10-week terms... but the problems were still there.
  • #7 Andrew
  • #8 Andrew: Small group interaction and then share one idea with everyone
  • #9 Lydia: As we looked to address the recognized issues with our OLSP courses, we took some time to look at research and pedagogical best practices we were already familiar with in our 8-week courses. This helped us lay a solid foundation on which we built our improvement efforts.
  • #10 Lydia: We first began with research. In our research, we were looking for solid and proven learning theories as our starting point. We came across Dr. Terry Anderson’s Interaction Equivalency Theory. With a growing body of research, this learning theory provided strong insight into a pedagogical approach for online, self-paced courses. This theory argues that there are three basic building blocks for a strong, formative learning experience: student to student interactions, student to content interactions, and student to teacher interactions. Anderson argues that one of these elements can be removed without compromising a formative learning experience. However, when an element is removed, the remaining elements need to increase proportionally to balance out the student’s needs for a formative learning experience. What does the application of this look like? When it comes to online self-paced courses, the student to student interactions are removed, as this is not a required element of the course. In turn, the student to content and student to teacher interactions need to increase equivalently in order to make up for the removed student to student interactions. We recognized that we previously had been eliminating the student to student interaction without offering an equivalent replacement. Our student to content interaction was high, but our student to teacher interaction was very low. Thus, we began to focus our attention on the student-to-teacher interactions. What would strong student-to-teacher interactions look like in the online self-paced environment?
  • #11 Lydia: For the second step in laying our foundation, we began reflecting on what we already knew based on our experience in our traditional online courses. The learning theory that our Faculty Training Course is built off of is the Community of Inquiry model. From this model, we knew that a strong instructional presence can work to increase student engagement and student satisfaction rates, in turn reducing student attrition rates. We knew that this manifests a more productive and engaging learning environment for our students and, overall, helped us accomplish Moody’s mission by “richly” equipping our students in the classroom. Again the question was raised: given what research has told us and what we know, what does a strong instructional presence look like in the online self-paced environment?
  • #12 Lydia: When we boiled it down, we understood a strong instructional presence to look like consistent instructor visibility and engagement with students. Practically, we knew that visibility and engagement were demonstrated by frequent instructor activity within the course (such as being active on the community discussion boards, availability for student questions, quick response times), regular instructor-initiated communication (like regular announcements, emails, and/or mini-lectures), and grading practices (i.e., timely and formative feedback). Thus, these components became the building blocks for the development of our OLSP Faculty Expectations--the basis for our instructional quality improvement plan.
  • #13 Lydia: Knowing that we needed to ensure the visibility of our instructors for students, we developed the OLSP Faculty Expectations to do just that... While we have other expectations of our faculty, these became the core, measurable standards we would hold our faculty accountable to as a part of the OLSP IQ Development Plan.
  • #14 Lydia: With our foundation solidly laid, we launched into developing the plan.
  • #15 Lydia: We structured our program with three core pieces: a communication flow, performance evaluation, and coaching. The OLSP IQ Communication Flow was our opportunity to begin engaging the faculty similar to how we were asking faculty to engage their students. We crafted regular emails to OLSP instructors that went out at regular 2-week intervals. These were built off of the Course Administration Checklist, which outlined the tasks for administering an OLSP course, step by step. This was intended to be a connection point with faculty as well as to offer reminders and encouragement for faculty. We received really strong positive feedback from our faculty on this communication plan. They appreciated the engagement and gentle reminders.
  • #16 Lydia: The second core element of our improvement efforts was performance evaluation: the OLSP IQ Monitoring Program. We were able to accomplish this monitoring by employing tools we already had at our fingertips. Utilizing reports generated in Blackboard, our LMS, and classroom observation, we were able to evaluate the instructional quality of instructors. We did this in intervals, 5 times throughout a term. Ensuring strong instructional quality was important to us, but we also wanted to be able to measure it. To do this, we developed a rubric which helped us to quantify the instructional performance in the class.
  • #17 Lydia: This is the rubric we use to measure faculty performance in our online self-paced courses. As you can see, the individual activities we measure--login activity, announcement activity, and grading activity--are outlined in the first column. When you follow each activity across its row, you can find a score on a scale of 1-3 (or 1-4 in the case of the announcements) associated with different levels of performance. The score associated with the performance level was then entered into a tracking spreadsheet.
  • #18 Lydia: This is the spreadsheet that the classroom monitor uses to record the scores from the rubric. It allows us to track performance throughout the term, for each week. For example, you can see in Week 2 that a two was assigned to the login activities of this hypothetical instructor. According to our rubric, this indicates that the instructor was only active 1 day a week instead of the required 2 times a week. However, you can see by the 4 in the Announcement column that a video announcement was posted and that the instructor has 0 graded items that were older than 7 days. Furthermore, you can see at the bottom of this spreadsheet that we were able to aggregate the score for each week’s performance in the various categories and then generate an overall instructional quality score from those totals. This allowed us to measure instructional performance and gave us trackable and comparable data.
  • #19 Lydia: The third and final component of our improvement efforts was the OLSP IQ Accountability Program. This was a faculty development/coaching program. Based on the data from the OLSP IQ Monitoring Program, we were able to reach out to instructors based on their performance level. Using principles of coaching and mentoring, we were able to help faculty overcome barriers to their performance and thus increase their instructional quality. The coaching was generally done via email and followed the same 5x-a-term pattern as the OLSP IQ Monitoring Program.
  • #20 Lydia
  • #21 Lydia
  • #22 Lydia
  • #23 Lydia: Overall, IQ Scores and EOC scores showed similar trends--thus confirming our standards.
  • #24 Lydia
  • #25 Andrew and Lydia
  • #26 Andrew and Lydia
  • #27 Andrew and Lydia