Assessment presentation

  1. Assessment in the Basic Speech Course: Purpose, Plan, Process, Continuous Improvement, and Benefits
  2. Workshop Presenters
     • Myra Walters, Moderator; Chair, Department of Speech Communication and Foreign Languages
     • Dr. Katie Paschall, Presenter; Professor of Speech Communication, Collier County Campus
     • Dr. Kevin Coughlin, Presenter; Director of Research, Planning and Development
  3. About Edison State College
     • Located in Southwest Florida
     • Has three campuses and one center
     • Provides instructional and support services for more than 25,000 credit and 3,000 non-credit students
  4. Oral Communication Assessment: Purpose
     • In May 2009, our basic speech courses were designated as part of the general education core
     • In September 2009, we were told we needed to assess the oral communication competency
  5. Developing an Assessment Plan
     • How will we record speeches?
     • How many speeches do we need to record?
     • How much will it cost to fund this assessment project?
     • Where will the money come from to support this assessment project?
  6. Developing an Assessment Plan
     • What type of speech should we record?
     • What should be the required length of the assessment speech?
     • How will we record and prepare speeches for viewing?
  7. The Assessment Plan
     • Held an assessment workshop facilitated by Dr. John Fredericks from Miami Dade
     • Decided on the topic, type, and length of the assessment speech
     • Received assistance from Dr. Coughlin in determining the number of speeches to record and which classes to record
  8. The Assessment Plan
     • Faculty were sent e-mails notifying them that their classes had been selected
     • Speeches were recorded using digital cameras
     • Edison Online student assistants uploaded the speech videos to the college's learning management system in three separate playlists (Speech Link)
  9. The Assessment Plan
     • Assessment teams were given one week to evaluate the speeches
     • Rubrics with rating scores were delivered to the department chair's office
     • Nine sets of rubrics were delivered to the assessment office for interpretation and evaluation
     • Full-time faculty met to discuss the assessment results
  10. Assessment Process: Overview
     • Pilot study: focus on inter-rater correlations and reliability estimates (fully crossed; small number of speeches)
     • Review: speech faculty consider the results and refine the rubric and other aspects of the rating process
     • Full study: focus on student performance; secondary focus on inter-rater correlations and reliability (nested, or partially crossed; all speeches)
  11. Assessment Process: Sample
     • Population: during the Fall 2011 term, 57 sections of SPC 1017 and SPC 2023 were offered on 3 campuses and 2 centers
     • Sample considerations: we wanted a representative sample, with as many speeches as the rating teams could feasibly assess
     • Sample: 5 sections of SPC 1017; 2 sections of SPC 2023; all students from each selected section; 167 speeches (835 minutes to rate; see the sampling sketch below)
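
A minimal sketch of the section-level (cluster) sampling described above: draw whole sections at random from each course, then record every student in each drawn section. The per-course section counts and IDs here are hypothetical; the source states only that 57 sections were offered in total.

    # Hypothetical sketch of cluster sampling by section (Python).
    # Section counts per course are assumptions, not taken from the source.
    import random

    random.seed(2011)
    spc_1017_sections = [f"SPC1017-{i:02d}" for i in range(1, 45)]  # assumed 44 sections
    spc_2023_sections = [f"SPC2023-{i:02d}" for i in range(1, 14)]  # assumed 13 sections

    # Draw 5 sections of SPC 1017 and 2 of SPC 2023, as in the actual sample
    sample = random.sample(spc_1017_sections, 5) + random.sample(spc_2023_sections, 2)
    print(sample)  # every student in each drawn section is recorded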
  12. Assessment Process: Pilot
     • Design characteristics
        • 14 speeches
        • Fully crossed: all raters rated all speeches
        • 9 raters (3 groups of 3)
        • 6 rubric dimensions or outcomes
     • Types of results
        • 14 tables and a summary outline
        • Means and standard deviations (percentage scores?)
        • Inter-rater reliability estimates by rubric dimension, across all 9 raters and all 6 dimensions (see the reliability sketch below)
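
The pilot's headline statistic is Cronbach's alpha computed over the fully crossed speeches-by-raters matrix, once per rubric dimension. A minimal sketch, assuming ratings for one dimension sit in a 14 x 9 NumPy array; the scores below are randomly generated placeholders, not the pilot's data.

    import numpy as np

    def cronbach_alpha(ratings):
        """Cronbach's alpha for a fully crossed design.

        ratings: 2-D array, rows = speeches, columns = raters.
        """
        ratings = np.asarray(ratings, dtype=float)
        k = ratings.shape[1]                         # number of raters
        rater_vars = ratings.var(axis=0, ddof=1)     # each rater's score variance
        total_var = ratings.sum(axis=1).var(ddof=1)  # variance of summed scores
        return (k / (k - 1)) * (1.0 - rater_vars.sum() / total_var)

    # 14 speeches rated by all 9 raters on one dimension (placeholder data)
    rng = np.random.default_rng(0)
    scores = rng.integers(1, 5, size=(14, 9))
    print(f"alpha = {cronbach_alpha(scores):.2f}")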
  13. Assessment Process: Pilot
     • Results
        • Consider the tables (the PDF file entitled "pilot . . .")
        • All raters (aggregated): all Cronbach's alphas > .80; highest = Nonverbal-Physical; lowest = Language
        • Reliability for each sub-group was generally lower than for the entire group of raters
        • Pairwise correlations highlighted areas of disagreement as to the meaning of each rubric dimension (computed as in the sketch below)
        • Again . . . the tables
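
The pairwise diagnostic above amounts to a correlation matrix over raters: a low off-diagonal entry flags a pair of raters reading a rubric dimension differently. A sketch with hypothetical data; pandas is an assumption, since the source does not name the tool used.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    # 14 speeches x 9 raters for one rubric dimension (placeholder values)
    ratings = pd.DataFrame(rng.integers(1, 5, size=(14, 9)),
                           columns=[f"rater_{i}" for i in range(1, 10)])

    # Pearson correlation between every pair of raters; low values highlight
    # disagreement about what the dimension means
    print(ratings.corr().round(2))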
  14. Assessment Process: Full Study
     • Design characteristics
        • Two foci: student performance and rubric performance
        • Nested (partially crossed)
        • 127 speeches (93 after exclusions)
        • 9 raters nested in 3 groups (3 raters per group)
        • Each group rated between 40 and 55 speeches
        • 8 dimensions (up from 6); 4 levels of rating
     • Data considerations
        • For student performance, we included only speeches that received a rating from each rater in the relevant group (see the filter sketch below)
        • For reliability, we considered all observations
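
A minimal sketch of the inclusion rule for the student-performance analysis, under which 127 speeches become 93: keep a speech only if every rater in its group scored it. The column names here are hypothetical.

    import pandas as pd

    def complete_cases(ratings, group_size=3):
        """Keep speeches rated by all `group_size` raters in their group.

        ratings: long-format frame with columns speech_id, rater_id, score.
        """
        n_raters = ratings.groupby("speech_id")["rater_id"].nunique()
        keep = n_raters[n_raters == group_size].index
        return ratings[ratings["speech_id"].isin(keep)]

    # Speech 101 has all three ratings; speech 102 is missing one and is dropped
    df = pd.DataFrame({"speech_id": [101, 101, 101, 102, 102],
                       "rater_id":  [1, 2, 3, 1, 2],
                       "score":     [3, 4, 3, 2, 3]})
    print(complete_cases(df))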
  15. Assessment Process: Full Study
     • Types of results
        • 25 tables and a summary outline
        • 19 tables associated with student performance
           • Achievement of student learning outcomes
           • Differential performance across campuses/centers
           • Differential performance across dimensions
           • ANOVAs, means, frequency distributions (see the ANOVA sketch below)
        • 6 tables associated with inter-rater correlations and reliability
           • Giant correlation table
           • Individual reliability tables
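
The campus comparison reported on the next slide is a one-way ANOVA per rubric dimension, with campus as the grouping factor. A sketch using SciPy and randomly generated scores; the source does not state what software produced its ANOVAs, and the per-campus group sizes below are an assumed split of the 93 included speeches.

    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(2)
    # Hypothetical per-speech scores on one dimension, grouped by campus
    campus_a = rng.normal(3.1, 0.5, size=40)
    campus_b = rng.normal(3.0, 0.5, size=35)
    campus_c = rng.normal(3.2, 0.5, size=18)

    f_stat, p_value = f_oneway(campus_a, campus_b, campus_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p < .05 => campuses differ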
  16. Assessment Process: Full Study
     • Results highlights
        • Frequency distributions indicate that students performed best on the Content dimension
        • ANOVA yielded evidence that student performance differed significantly by campus in only 2 of 8 dimensions (outcomes): Body and Conclusion
        • Giant correlation table (show them the table)
           • The second rating group had both the highest and the lowest inter-rater correlations
           • Rater 3 and Rater 6 had the strongest level of agreement
        • Cronbach's alphas exceeded .9 in all cases
  17. Continuous Improvement
     • Revised the rubric to better assess the public speaking competencies
     • Revised the directions for the final speech
     • Planned professional development through our TLC
     • Planning a workshop to increase rubric reliability
     • Developing a repository of sample student speeches for instructional purposes
     • Planning to repeat the assessment in Spring 2013
  18. Benefits
     • It can be done!
     • We can do it!
     • Camaraderie among faculty
     • Meaningful discussions
     • Gained insight into what is being taught in the classroom
     • Development of relationships with other departments (e-learning, technical support, assessment office)
