2. Disclosure Slide
Presenter: Jeff Bachiu
Relationships with commercial interests: None
We work for NOSM and are invested in the success of the CCC program.
4. CCC sites
• Fifteen medium-sized population centres (pop. 5,000 – 75,000) outside the main campuses
• Family physician offices plus a hospital with basic services
• But all sites are different in:
Size
Facilities
Culture
Communities
• Sites:
• Kenora
• Dryden
• Sioux Lookout
• Fort Frances
• Sault Ste. Marie
• Hearst
• Kapuskasing
• Timmins
• Timiskaming Shores
• Manitoulin
• North Bay
• Bracebridge
• Huntsville
• Parry Sound
• Midland
5. Question
• Accreditation requires comparability of sites – no negative effect on learning from any site (ED-8)
• How to ensure that all CCC sites provide a comparable learning experience?
• Accreditation answer: through common curriculum and assessments
• But are we sure a community doesn't affect learning?
7. Methods
Longitudinal review of assessment performance
Scored (not pass/fail) assessment data (by Theme) for:
• Phase 2 exams,
• Phase 3 rotation exams, and
• LMCC scores
Cohorts E2007–E2010 in this analysis (CCC in 2009–2012)
All scores post-remediation (if required)
To see whether patterns develop over time
8. Students per site
[Chart: different numbers of students per CCC site, by cohort (E2007–E2010); y-axis: number of students, 0–10]
9. Analysis
Indices of community student exam performance
For each exam event:
MS = (mean score per site) − (mean score all sites)
SH = count of MS > 1
SL = count of MS < −1
Site score = Σ(SH − SL)
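The index above can be sketched in a few lines of code. This is a minimal illustration, assuming per-site mean scores keyed by exam event; the data layout and function name are illustrative, not from the source.

```python
from collections import defaultdict
from statistics import mean

def site_scores(events):
    """Compute the cumulative site score described on the Analysis slide.

    events: dict mapping exam event -> dict of site -> mean score at that site.
    For each event, MS = (site mean) - (mean across all sites); a site gains
    a point (SH) when MS > 1 and loses one (SL) when MS < -1. The returned
    site score is the sum of (SH - SL) over all exam events.
    """
    totals = defaultdict(int)
    for by_site in events.values():
        overall = mean(by_site.values())       # mean score across all sites
        for site, score in by_site.items():
            ms = score - overall               # MS for this site and event
            if ms > 1:
                totals[site] += 1              # contributes to SH
            elif ms < -1:
                totals[site] -= 1              # contributes to SL
    return dict(totals)
```

For example, a site scoring 2.5 points above the all-site mean on two exam events would accumulate a site score of +2; sites within ±1 of the mean contribute nothing, which is what keeps the index insensitive to small fluctuations.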
10. All sites – Phase 2 (n=12)
[Chart: site score by cohort (E2007–E2010); y-axis: site score, −6 to 6]
12. Large sites – Phase 2 (n=4)
[Chart: site score by cohort (E2007–E2010); y-axis: site score, −6 to 6]
13. Small sites – Phase 2 (n=4)
[Chart: site score by cohort (E2007–E2010); y-axis: site score, −6 to 6]
14. East sites – Phase 2 (n=8)
[Chart: site score by cohort (E2007–E2010); y-axis: site score, −6 to 6]
15. West sites – Phase 2 (n=4)
[Chart: site score by cohort (E2007–E2010); y-axis: site score, −6 to 6]
16. Discussion
• Site performance goes up
• Site performance goes down
• No stability over more than 3 years
• Is variation in performance between sites more attributable to student differences?
• One site consistently below par in Phase 2 and falling overall – needs follow-up
17. Conclusions
• No major discrepancies
• One site requiring further investigation
• The data are indicators of possible problems – they may be attributable to many causes, including random variation
• Ongoing monitoring – longitudinal tracking improves over time
• Tracking and follow-up are part of accreditation
• And of accountability – to students, communities, and Northern Ontario