Presented at the 2017 CAHEA conference, this presentation provides an overview of a pioneering faculty evaluation and development project in online self-paced courses. By combining best practices adapted for online self-paced learning, an instructional quality monitoring program, and elements of instructional coaching, Moody Bible Institute Distance Learning (MBI-DL) has seen great success in reducing student and instructor isolation, increasing instructional quality, and increasing student persistence rates.
Objectives:
1. Outline a model of faculty evaluation and development for online, self-paced courses
2. Demonstrate the impact of faculty evaluation and development on online self-paced instructional quality and student success
3. Provide a summary of lessons learned in implementing this pilot project
1. Evaluating and Developing Faculty in Online Self-Paced Courses
June 28, 2017
23rd Annual CAHEA Conference - Omaha, NE
Lydia Gillespie and Andrew Beaty
Moody Bible Institute | Distance Learning
2. The Presenters
Lydia Gillespie, MPS
Project Manager
Moody Distance Learning
Instructional Quality Team
Andrew Beaty, Ed.D. (ABD), M.A.C.E.
Associate Director of Faculty Development and Assessment
Moody Distance Learning
Instructional Quality Team
3. Agenda
1. Outline a model of faculty evaluation and development for online, self-paced courses
2. Demonstrate the impact of the faculty evaluation and development model
3. Provide a summary of lessons learned
5. Understanding the Problem
Student Experience
● Low student evaluation scores
● High student attrition rates
● Student disengagement
Faculty Behaviors
● Lack of engagement in the course
● "Glorified graders"
● No regular contact with students
● Slow return of graded work
Administrative Perspective
● Money maker vs. educational experience
● Low value for the venue
● Lack of understanding of pedagogical needs
9. Back to the Basics
What Research Taught Us
Interaction Equivalency Theory, developed by Dr. Terry Anderson (2003)
Three building blocks for a strong learning experience:
● Student-to-student interaction
● Student-to-content interaction
● Student-to-teacher interaction
[Diagram: a formative learning experience at the intersection of student-content, student-teacher, and student-student interaction]
10. Back to the Basics
What We Knew
A strong instructional presence can:
● Increase student engagement
● Increase student satisfaction rates
● Decrease student attrition rates
● Foster a productive, engaged learning environment
● Work to accomplish Moody's mission
Supported by the Community of Inquiry (CoI) model (Garrison, Anderson, & Archer, 2000)
11. Back to the Basics
What We Knew
A strong instructional presence is demonstrated by consistent instructor visibility.
Visibility can look like:
● Frequent instructor activity within the course
● Regular instructor communication
● Grading feedback
(Lehman & Conceição, 2010)
12. Measuring Performance
OLSP Faculty Expectations
● Active in the course twice a week
● Post weekly video announcements
● Return graded assignments within seven days
14. Evaluating and Developing Performance
OLSP IQ Standards
Communication Flow
● Regular emails to OLSP instructors
● Built off the Course Administration Checklist
● Intended to remind and encourage instructors toward expected teaching behaviors
OLSP IQ Monitoring Program
● Instructional quality evaluation
● Classroom observation five times throughout the term
● Instructional quality assessed with a rubric
● Instructional Quality (IQ) score calculated for the term
OLSP IQ Accountability Program
● Faculty development/coaching program
● Coaching email sent five times throughout the term
● Based on data from the OLSP IQ Monitoring Program
● Data used to inform instructional course assignments
20. Measuring the Plan
Instructional Quality Scores
Faculty instructional quality has improved as the program has continued.
Highlights:
● Fall 2015 to Spring 2016 IQ score growth: 6.5%
● In subsequent terms, IQ scores hold steady at an average of 91%
Average Instructional Quality Score, Fall 2015 - Spring 2017:
Fall 2015: 84.00%
Spring 2016: 89.49%
Summer 2016: 92.57%
Fall 2016: 90.05%
Spring 2017: 91.10%
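The highlighted figures can be checked against the term scores above; a quick sketch of the arithmetic, assuming the 6.5% growth is relative rather than percentage points:

```python
# Average IQ scores by term, as reported on the slide.
scores = {"Fall 2015": 84.00, "Spring 2016": 89.49,
          "Summer 2016": 92.57, "Fall 2016": 90.05, "Spring 2017": 91.10}

# Fall 2015 -> Spring 2016 growth, read as relative improvement: ~6.5%
growth = (scores["Spring 2016"] / scores["Fall 2015"] - 1) * 100
print(round(growth, 1))  # 6.5

# Subsequent terms hold steady at an average of about 91%
later = [scores["Summer 2016"], scores["Fall 2016"], scores["Spring 2017"]]
print(round(sum(later) / len(later), 2))  # 91.24
```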
21. Measuring the Plan
Student Evaluation Scores
Students are more satisfied with the instructional quality.
Highlights:
● 12% increase in student evaluation scores
● 8% above minimum expectation
23. Measuring the Plan
Course Completion Rates
More students are successfully completing OLSP courses.
Highlights:
● 19% increase in course completion rate
● 3% over the minimum expectation
Historical Average (Fall 2010 - Sum 2015): 69%
Current Average (Fall 2015 - Sum 2016): 88%
Minimum Expectation: 85%
24. Best Practices Learned
● You CAN change the ethos of a program
● Faculty engagement is CRITICAL to successful teaching
● Even in a self-paced environment, students respond to faculty engagement
● Training and valuing faculty is vital
25. Future Plans
Launching new training course for self-paced instructors
Similar program started in 8-Week courses (beta phase)
Working on “big data” to analyze other trends
27. Contact Us
Lydia Gillespie | Project Manager
Lydia.Gillespie@moody.edu
Andrew Beaty | Associate Director of Faculty Development and Assessment
Andrew.Beaty@moody.edu
Find this presentation at: http://bit.ly/MoodyCAHEA2017
28. Bibliography
Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. The International Review of Research in Open and Distance Learning (IRRODL), 4(2).
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.
Lehman, R. M., & Conceição, S. C. O. (2010). Creating a sense of presence in online teaching. San Francisco: Jossey-Bass.
Editor's Notes
Andrew and Lydia
Andrew:
Go way back to 1901
We were the predecessor to MOOCs... Massive Open Online Courses (joke)
Andrew:
Struggle with faculty behaviors
Struggle with the mindset of this being a money maker vs. an educational experience for students
I had told Janet, our boss, for a couple of years that I KNEW we were going to inherit it at some point and that it was going to be UGLY... it WAS!
Our team inherited it in the middle of this time and then transitioned it to two 16-week terms and one 10-week term... but the problems were still there.
Andrew
Andrew: Small group interaction and then share one idea with everyone
Lydia:
As we looked to address the recognized issues with our OLSP courses, we took some time to review research and the pedagogical best practices we were already familiar with from our 8-week courses. This helped us lay a solid foundation on which we built our improvement efforts.
Lydia:
We first began with research. We were looking for solid and proven learning theories as a starting point, and we came across Dr. Terry Anderson's Interaction Equivalency Theory. Backed by a growing body of research, this learning theory provided strong insight into a pedagogical approach for online, self-paced courses. The theory argues that there are three basic building blocks for a strong, formative learning experience: student-to-student interactions, student-to-content interactions, and student-to-teacher interactions. Anderson argues that one of these elements can be removed without compromising a formative learning experience. However, when an element is removed, the remaining elements need to increase proportionally to balance out the student's needs for a formative learning experience.
What does the application of this look like? In online self-paced courses, the student-to-student interactions are removed, as they are not a required element of the course. In turn, the student-to-content and student-to-teacher interactions need to increase equivalently in order to make up for the removed student-to-student interactions.
We recognized that we had previously been eliminating the student-to-student interaction without offering an equivalent replacement. Our student-to-content interaction was high, but our student-to-teacher interaction was very low. Thus, we began to focus our attention on the student-to-teacher interactions. What would strong student-to-teacher interactions look like in the online self-paced environment?
Lydia:
For the second step in laying our foundation, we began reflecting on what we already knew from our experience in our traditional online courses. The learning theory that our Faculty Training Course is built off of is the Community of Inquiry model. From this model, we knew that a strong instructional presence can work to increase student engagement and student satisfaction rates, in turn reducing student attrition rates. We knew that this manifests a more productive and engaging learning environment for our students and, overall, helps us accomplish Moody's mission by "richly" equipping our students in the classroom.
Again the question was raised: given what research has told us and what we know, what does a strong instructional presence look like in the online self-paced environment?
Lydia:
When we boiled it down, we understood a strong instructional presence to look like consistent instructor visibility and engagement with students. Practically, we knew that visibility and engagement were demonstrated by frequent instructor activity within the course (such as being active on the community discussion boards, availability for student questions, and quick response times), regular instructor-initiated communication (like regular announcements, emails, and/or mini-lectures), and grading practices (i.e., timely and formative feedback).
Thus, these components became the building blocks for the development of our OLSP Faculty Expectations--the basis for our instructional quality improvement plan.
Lydia: Knowing that we needed to ensure the visibility of our instructors for students, we developed the OLSP Faculty Expectations to do just that...
...While we have other expectations of our faculty, these became the core, measurable standards we would hold our faculty accountable to as a part of the OLSP IQ Development Plan.
Lydia: With our foundation solidly laid, we launched into developing the plan.
Lydia: We structured our program with three core pieces: 1) a communication flow, 2) performance evaluation, and 3) coaching.
The OLSP IQ Communication Flow was our opportunity to begin engaging the faculty similarly to how we were asking faculty to engage their students. We crafted regular emails to OLSP instructors that went out at regular two-week intervals. These were built off of the Course Administration Checklist--which outlined the tasks for administering an OLSP course, step by step. The flow was intended to be a connection point with faculty as well as to offer reminders and encouragement. We received strong positive feedback from our faculty on this communication plan; they appreciated the engagement and gentle reminders.
Lydia:
The second core element of our improvement efforts was performance evaluation: the OLSP IQ Monitoring Program. We were able to accomplish this monitoring using tools we already had at our fingertips. Utilizing reports generated in Blackboard, our LMS, along with classroom observation, we were able to evaluate the instructional quality of instructors. We did this in intervals, five times throughout a term. Ensuring strong instructional quality was important to us, but we also wanted to be able to measure it. To do this, we developed a rubric which helped us quantify the instructional performance in the class.
Lydia:
This is the rubric we use to measure faculty performance in our online self-paced courses. As you can see, the individual activities we measure--login activity, announcement activity, and grading activity--are outlined in the first column. When you follow each activity across its row, you can find a score on a scale of 1-3 (or 1-4 in the case of the announcements) associated with different levels of performance. The score associated with the performance level was then entered into a tracking spreadsheet.
Lydia:
This is the spreadsheet that the classroom monitor uses to record the scores from the rubric. It allows us to track performance throughout the term, week by week.
For example, you can see in Week 2 that a two was assigned to the login activities of this hypothetical instructor. According to our rubric, this indicates that the instructor was only active one day that week--instead of the required two times a week. However, you can see by the 4 in the Announcement column that a video announcement was posted, and that the instructor had no graded items older than seven days.
Furthermore, you can see at the bottom of the spreadsheet that we were able to aggregate the scores for each week's performance in the various categories and then generate an overall instructional quality score from those totals. This allowed us to measure instructional performance and gave us trackable and comparable data.
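The aggregation described above can be sketched in a few lines. This is a hypothetical reconstruction, not the actual spreadsheet formulas: the field names, the equal weighting of categories, and scoring as earned-over-possible points are all assumptions, based only on the rubric scales mentioned (1-3 for login and grading, 1-4 for announcements).

```python
# Hypothetical sketch of the OLSP IQ score aggregation described above.
# Field names and the earned/possible weighting are assumptions.

# Rubric maximums: login and grading are scored 1-3, announcements 1-4.
MAX_POINTS = {"login": 3, "announcement": 4, "grading": 3}

def term_iq_score(weekly_scores):
    """Aggregate weekly rubric scores into an overall IQ percentage.

    weekly_scores: one dict per observation, e.g.
        {"login": 2, "announcement": 4, "grading": 3}
    """
    earned = sum(sum(week.values()) for week in weekly_scores)
    possible = sum(MAX_POINTS[k] for week in weekly_scores for k in week)
    return round(100 * earned / possible, 2)

# The hypothetical instructor from the example: active only one day in
# Week 2 (login score 2), but a video announcement was posted (4) and no
# graded items were older than seven days (3).
observations = [
    {"login": 3, "announcement": 4, "grading": 3},  # Week 1: full marks
    {"login": 2, "announcement": 4, "grading": 3},  # Week 2: low login activity
]
print(term_iq_score(observations))  # 95.0
```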
Lydia:
The third and final component of our improvement efforts was the OLSP IQ Accountability Program, a faculty development/coaching program. Based on the data from the OLSP IQ Monitoring Program, we were able to reach out to instructors according to their performance level. Using principles of coaching and mentoring, we were able to help faculty overcome barriers to their performance and thus increase their instructional quality. The coaching was generally done via email and followed the same pattern as the OLSP IQ Monitoring Program: five times a term.
Lydia
Lydia
Lydia
Lydia: Overall, IQ Scores and EOC scores showed similar trends--thus confirming our standards.