
Rubric to evaluate online course syllabi plans for engendering a COI: Round II


We replicated a research study that analyzed online course syllabi with the Online Community of Inquiry (COI) Syllabus Rubric© (Rogers & Van Haneghan, 2016). The rubric consists of the following elements: instructional design for cognitive presence, technology tools for COI, COI loop for social presence, support for learner characteristics, and instruction and feedback for teaching presence. We reviewed 31 syllabi across disciplines and found above average cognitive presence, average social presence, and basic teaching presence. I presented our research at the Association for Educational Communications & Technology (AECT) 2018 conference in Kansas City, MO.



  1. AECT 2018. Rubric to Evaluate Online Course Syllabi Plans for Engendering a COI: Round II. Dr. Sandra Rogers, Spring Hill College; Dr. Sam Khoury, Spring Hill College.
  2. Rationale for Syllabus Review: starting point; plan of action (syllabus, schedule, rubrics, & guiding documents); readily accessible; potential level of online COI (low, moderate, or high). We recognize that it's not an exact plan of what goes on in the actual online course.
  3. Community of Inquiry (COI) (Garrison, Anderson, & Archer, 2000).
  4. COI Framework Defined: Co-construction of meaning via shared learning experiences to engender student agency through connectedness (social presence); learning activities engage higher-order thinking skills (cognitive presence); & instruction & feedback meet the needs of all learners (teaching presence).
  5. Cognitive Presence: Bloom's (1956) taxonomy for learning objectives; QM 2.5 (2014), suitability of learning objectives. Respect diverse talents & ways of learning, from Chickering & Gamson's 7 Principles of Good Practice for F2F applied to elearning (Arbaugh & Hornik, 2006); QM 7.1-7.4, learner support. E-communication promotes S-S & S-T interactions to close the psychological distance in elearning (Lemak, Shin, Reed, & Montgomery, 2007; Moore & Kearsley, 1996); QM 4.5 (2014), variety of instructional materials.
     Social Presence: Lack of S-S interactions decreases student satisfaction & achievement (Bernard et al., 2009; Granitz & Green, 2003). Lack of synchronous sessions leads to reduced achievement (Bernard et al., 2009). Lack of multiple forms of communication with & between students causes dissatisfaction (Granitz & Green, 2003).
     Teaching Presence: Instructor immediacy behaviors strongly correlate with student satisfaction & achievement (Arbaugh, 2001; Baker, 2010). Encourage contact between S-T & give prompt feedback, from Chickering & Gamson's Principles applied to elearning (Arbaugh & Hornik); QM 3.5 (2014), provide learners with multiple opportunities to track learning. QM 5.3 (2014): classroom response time & feedback on assignments are timely.
  6. Online COI Syllabus Rubric (OCOISR). ID for Cognitive Presence: Instructional design offers extensive cognitive activities such as exploration, integration, resolution, & triggering events (analysis, synthesis, evaluation). Support for Learner Characteristics: Extensive learner support and available resources are identified (e.g., disability services, remedial services, strategies/tips, & scaffolding of assignments).
  7. Exemplary Rubric Descriptors cont. COI Loop for Social Presence: Open communication actions provide for extensive S-T, S-S, & S-P/E interactions and opportunities for student-led moderation of forums. Collaboration is required to build group cohesion, and a rubric is provided.
  8. Exemplary Rubric Descriptors cont. Educational Technology for COI: Technology could extensively facilitate a COI (e.g., email, assignment, forum, multimedia project, sharing tool, & synchronous meeting tools for group work).
  9. Exemplary Rubric Descriptors cont. Instruction & Feedback for Teaching Presence: Extensive information provided on instructor feedback format with prompt turnaround time. Multi-modal direct instruction is mentioned. Instructor offers virtual office hours, format, & social media for classroom interactions.
  10. Research Methods: Action research that included ID feedback; 2 raters; 31 e-course syllabi across disciplines; a purposive sampling of 13 graduate & 18 undergraduate e-courses; syllabi were not included in the study if the course had only been taught once online.
  11. OCOISR Analysis Tool: The rubric has the following scales: low (1 point), basic, moderate, above average, & exemplary (5 points). Points awarded determine the course's potential level of engendering an online COI: low (1-9 points), moderate (10-17 points), or high (18-25 points). (A minimal scoring sketch follows the slide transcript below.)
  12. Reliability Indices: AgreeStat was used to obtain Cohen's kappa (κ), the intraclass correlation coefficient (ICC), & Gwet's first-order agreement coefficient (AC1) to determine reliability. Gwet's AC1 addressed the kappa paradox, which skews κ when there is high agreement between raters or an imbalance in the margins of the data tables (Cicchetti & Feinstein, 1990; Gwet, 2008). (See the agreement-statistics sketch after the transcript.)
  13. ICC Results. Q1: Can raters obtain agreement in all categories of the rubric? There was a poor to good degree of consistency among measurements, ICC = .821, p < .001, 95% CI [.40, .932]. This was a 2-way random-effects analysis of variance with interaction (Gwet's AgreeStat, Version 2015.6.1, Advanced Analytics, Gaithersburg, MD, USA). Even though the ICC point estimate indicated excellent reliability, we must consider the confidence interval.
  14. Rater Agreement Per Category: There was favorable absolute agreement between raters for all five rubric categories. The weakest areas of percent agreement were ID for CP (55%), Tech (61%), & TP (65%); the strongest area was SP (87%) (Altman, 1991). These were 2-rater, chance-corrected, unweighted agreement coefficients (AgreeStat 2015.6.1).
  15. Mean Rater Scores from the OCOISR. Q2: What is the potential for engendering a COI in the online college courses based on their syllabi? (Chart of mean rater scores per category, labeled above average, moderate, basic, and basic.)
  16. Common ID Feedback
      ID for CP: Include higher-order thinking activities such as case analysis or papers that require synthesis or evaluation of peer, self, and/or product.
      Ed Tech for COI: Add group work for collaborating on projects. Use Schoology's Media Album for students to share their projects and obtain peer feedback.
      COI Loop for SP: Provide a rubric for discussions to make criteria clear. Provide discussions on readings to enhance learning from each other.
      Support for Learner Characteristics: Add accommodations statement. Provide links to academic tutoring services. Provide strategies for remediation and/or resources for building background knowledge.
      Instruction & Feedback for TP: Add specific online virtual office hours and format options. Provide a description of direct instruction. Add information on feedback response time & format.
  17. CONTACT: Dr. Sandra Rogers, Instructional Designer, srogers@shc.edu, Twitter @teacherrogers, Teacherrogers.com; Dr. Samir Khoury, Assistant Professor & Director of MBA Online Program, skhoury@shc.edu.
  18. REFERENCES
      Arbaugh, J. B., & Hornik, S. (2006). Do Chickering and Gamson's seven principles also apply to online MBAs? The Journal of Educators Online, 3(2), 1-18. doi:10.9743/jeo.2006.2.4
      Baker, C. (2010). The impact of instructor immediacy and presence for online student affective learning, cognition, and motivation. The Journal of Educators Online, 7(1), 1-30. doi:10.9743/jeo.2010.1.2
      Bloom, B. S. (Ed.), Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook 1: Cognitive domain. New York, NY: Longmans, Green and Company. doi:10.1177/001316445601600310
  19. References cont.
      Cicchetti, D. V., & Feinstein, A. R. (1990). High agreement but low kappa: II. Resolving the paradoxes. Journal of Clinical Epidemiology, 43(6), 551-558. doi:10.1016/0895-4356(90)90159-m
      Gwet, K. L. (2008). Computing inter-rater reliability and its variance in the presence of high agreement. British Journal of Mathematical and Statistical Psychology, 61(1), 29-48. doi:10.1348/000711006X126600
      Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. doi:10.1016/s1096-7516(00)00016-6
  20. References cont.
      Granitz, N., & Greene, C. S. (2003). Applying e-marketing strategies to online distance learning. Journal of Marketing Education, 25(1), 16-30. doi:10.1177/0273475302250569
      Lemak, D., Shin, S., Reed, R., & Montgomery, J. (2005). Technology, transactional distance, and instructor effectiveness: An empirical investigation. Academy of Management Learning & Education, 4(2), 150-158. doi:10.5465/amle.2005.17268562
      Moore, M. G., & Kearsley, G. (1996). Distance education: A systems view. Belmont, CA: Wadsworth Publishing.
  21. References cont.
      Rogers, S., & Van Haneghan, J. (2016). Rubric to evaluate online course syllabi plans for engendering a community of inquiry. Proceedings of Society for Information Technology & Teacher Education International Conference, 349-357. Chesapeake, VA: AACE.
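
To make the scoring logic on slide 11 concrete, here is a minimal sketch of how five OCOISR category ratings could be totaled and mapped to a course's potential COI level. Only the 1-5 scale labels and the 1-9 / 10-17 / 18-25 cut-offs come from the presentation; the Python names and the example ratings are illustrative, not part of the published rubric.

```python
# Illustrative sketch of the OCOISR scoring described on slide 11.
# The category list, scale labels, and point cut-offs come from the slides;
# the function and variable names are ours.

RUBRIC_CATEGORIES = [
    "ID for Cognitive Presence",
    "Educational Technology for COI",
    "COI Loop for Social Presence",
    "Support for Learner Characteristics",
    "Instruction & Feedback for Teaching Presence",
]

SCALE = {"low": 1, "basic": 2, "moderate": 3, "above average": 4, "exemplary": 5}


def coi_potential(ratings):
    """Sum one 1-5 rating per rubric category and map the total to a COI level."""
    total = sum(SCALE[ratings[category]] for category in RUBRIC_CATEGORIES)
    if total <= 9:
        level = "low"
    elif total <= 17:
        level = "moderate"
    else:  # 18-25 points
        level = "high"
    return total, level


if __name__ == "__main__":
    # Hypothetical ratings for a single syllabus.
    example = {
        "ID for Cognitive Presence": "above average",
        "Educational Technology for COI": "moderate",
        "COI Loop for Social Presence": "moderate",
        "Support for Learner Characteristics": "basic",
        "Instruction & Feedback for Teaching Presence": "basic",
    }
    print(coi_potential(example))  # -> (14, 'moderate')
```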
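
Slide 12's point about the kappa paradox can also be seen numerically. The sketch below restates the standard two-rater formulas for percent agreement, Cohen's kappa, and Gwet's AC1 (Gwet, 2008); it is not the presenters' AgreeStat analysis, and the example ratings are invented purely to show how kappa can collapse when nearly all ratings fall in one category.

```python
# Illustrative sketch of the agreement statistics named on slide 12: percent
# agreement, Cohen's kappa, and Gwet's AC1 for two raters. It restates the
# standard formulas (see Gwet, 2008) and is not the presenters' AgreeStat run.
from collections import Counter

CATEGORIES = ["low", "basic", "moderate", "above average", "exemplary"]


def agreement_stats(rater_a, rater_b):
    """Two-rater percent agreement, Cohen's kappa, and Gwet's AC1 over CATEGORIES."""
    n = len(rater_a)
    q = len(CATEGORIES)

    # Observed agreement: share of syllabi both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Marginal proportions for each rater and category.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    marg_a = {c: count_a[c] / n for c in CATEGORIES}
    marg_b = {c: count_b[c] / n for c in CATEGORIES}

    # Cohen's chance agreement multiplies each rater's own marginals.
    p_e_kappa = sum(marg_a[c] * marg_b[c] for c in CATEGORIES)

    # Gwet's chance agreement uses the average marginal per category, which
    # stays small when the ratings are heavily concentrated in one category.
    pi = {c: (marg_a[c] + marg_b[c]) / 2 for c in CATEGORIES}
    p_e_ac1 = sum(pi[c] * (1 - pi[c]) for c in CATEGORIES) / (q - 1)

    return {
        "percent_agreement": p_o,
        "cohen_kappa": (p_o - p_e_kappa) / (1 - p_e_kappa),
        "gwet_ac1": (p_o - p_e_ac1) / (1 - p_e_ac1),
    }


if __name__ == "__main__":
    # Invented ratings showing the paradox: 95% raw agreement, yet kappa
    # collapses to 0 because almost every rating falls in one category.
    a = ["moderate"] * 19 + ["basic"]
    b = ["moderate"] * 20
    print(agreement_stats(a, b))
    # ≈ {'percent_agreement': 0.95, 'cohen_kappa': 0.0, 'gwet_ac1': 0.95}
```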
