2. Perspectives on Successful Transitions for
Internationally Educated Health Professionals
Sandra Monteiro, PhD
March 2, 2017
3. Impact and Outcomes
Measuring Impact – typically means measuring change
Measuring Outcomes – typically means measuring learning, skill, competence
5. What are the outcomes of a transition program?
Customizing program evaluation
Sandra Monteiro
David Rojas
6. Agenda
• Describe the pre-residency education program
• Who should define success?
– Examples of studies on predicting and measuring outcomes
• What are we doing to measure “successful” outcomes? But also “failed” outcomes?
– Innovative way to think about program evaluation
7. Pre-Residency Program at Touchstone Institute
Provide international medical graduates with an exceptional learning experience in medical training and practice in Canada, with a focus on:
– Patient-Centred Communication
– Collaboration
– Professionalism
– Health Advocacy
– (Medical Expert)
9. Defining Success
Touchstone Institute?
– Integrate best and defensible practices in education and assessment
– Address stakeholders’ needs
– Provide international medical graduates with an exceptional learning experience
Medical Residents?
– Preparation for a successful residency
– Prefer not to attend
Public? Residency Programs? Regulators?
– Residents are ready to provide high quality care
10. Markers of success in education
Focused feedback
Goal directed teaching
Opportunities for deliberate practice
11. Markers of success in education
Focused feedback
• Information on their strengths
• Opportunities to improve
• Not specific to specialty
Goal directed teaching
• Whose goals should we evaluate?
Opportunities for deliberate practice
• Small Group Simulations
• Not everyone is equally engaged
12. Markers of success in residency
“From 2005 to 2009, … the CMG pass rate was 95%, while the IMG pass rate was 76%”*
Can the PRP do anything to address this?
What factors influence these outcomes?
*Thomson, G., & Cohl, K. (2011). IMG selection: Independent review of access to postgraduate programs by international medical graduates in Ontario. Volume 1: Findings and recommendations; Volume 2: Analysis and background.
14.
• Medical Council of Canada Evaluating Examination scores
– Simulated office orals
– Short answer management problems
• Demographics of Medical Residents
– Country of birth (Human Development Index)
– First language
– Professional experience; internship
15.
• A first language other than English was related to a positive outcome on the exam
• Prior professional experience and country of birth showed a slight benefit for those from outside Canada
16. Studies of outcomes have an impact
• The development of a multi-institution data sharing, processing, and governance agreement between Ontario PGME programs and the CFPC and RCPSC
19. Program Evaluation in Health Professions Education
• Identify what works
• Often focused on satisfaction ratings
20. Program Evaluation in Health Professions Education
Outcome based
• Rigid model
• Focus only on one outcome
Theory based
• Rigid model
• Acknowledges complexity but is unable to capture it
Holistic program evaluation
• Flexible model
• Understands emergence
• Acknowledges and aims to capture complexity
21. Holistic evaluation
Acknowledges and aims to capture complexity through emergent properties
Parker, K. (2013). A better hammer in a better toolbox: Considerations for the future of program evaluation.
22. How do we define emergence?
Emergent properties are those unexpected behaviors exhibited by the system that cannot be explained by its individual elements or by the interactions of the components in the system.
Emergence from Design
• Director
• Coordinator
Emergence from Expectations
• Students
• Instructors
Stakeholders
23. Example of unintended outcome
Not engaging with simulated sessions
Trainee dissatisfaction with PRP
• Not enough specialty content; seen as not applicable
• Frustration with the transition program, seen as a hurdle and a burden to overcome
• Lack of communication to trainees
24. Future Directions
• Communicate with all stakeholders to better define goals of the PRP and areas for improvement
• Address intended but failed outcomes and unintended negative outcomes
These are not mutually exclusive
e.g. How is maternal and infant health impacted by changes to the ObGyn residency curriculum? Does the introduction of a new assessment change the way trainees enter the workforce?
e.g. what are the consequences (on learning, skill, competence) of introducing a transition program for internationally trained medical residents?
Introduce myself -
This is the question posed to the previous panel, but really that’s asking about the key ingredients for a successful program.
I’d rather not list the key ingredients of successful programs –
DR – I completely agree with this suggestion, I think questioning the question that was proposed to you guys is a strong start.
Based on the next slide, I think it will be useful to mention the relationship between program evaluation and measuring “success”. Identifying whether a program is successful can only be attained through evaluation. However, your point is that multiple stakeholders might have a different understanding of what “successful” means, so an innovative evaluation approach needs to be utilized.
Instead, I’d like to propose that what we need to think about is how to customize a training program to maximize positive outcomes, and how to evaluate changes over time. I would emphasize that success means different things to the stakeholders involved, so consistent, open communication is essential to ensure that what is expected is clear and what is emergent is valued.
DR – Perfect, this relates to the last comments I made on slide #2.
This is somewhat of a unique transition program, as the trainee group is already pre-selected for practice – they are all headed to medical residency programs in Ontario. Given they have already been assessed for entry-level skills, we try to balance the needs of our trainees with those of the residency/medical schools.
The current direction of medical education in Canada, and in several countries worldwide, is a competency-based framework – one that emphasizes key roles along a line of competency development rather than a timeline that emphasizes years of training.
Underlined aspects are those that we want to measure – do some of our outcomes include the perception of an exceptional learning experience?
Asynchronous learning opportunities – these allow trainees to customize training to their schedule. They also provide an opportunity for ongoing support by making online modules available after training.
Supportive faculty, simulated training sessions that encourage practice of skills, opportunity to receive individualized support through personal coaching
These are characteristics of training programs recommended in the literature, and they are also consistent with what past trainees have suggested as improvements over the years.
But how do we know we are in fact achieving our goal – do trainees have an exceptional experience? Does the PRP do anything to help them prepare for residency? Are there more positive or negative outcomes? The problem is we don’t really know what the outcomes are.
DR – I would add the idea of the processes; maybe the PRP outcomes are not directly related to the processes in the program. What other factors are influencing the program outcomes? Why? How do the program processes relate to the outcomes?
Markers of success
Touchstone Institute?
We can describe our approach to education, but stakeholders’ needs have changed and we need to re-evaluate them so we can customize our approach. We have involved external reviewers to evaluate the program – remember I suggested that one way to ensure objectivity in the measurement or training of subjective states is to increase transparency – DR (perfect)
Trainees?
Will they have a successful residency?
Unfortunately, they have a limited understanding of their skill level and education needs – expectations may not match PRP goals, and this leads to critical misunderstanding – DR (perfect)
Public?
Will trainees be ready to provide high quality care? We cannot tell in 2 to 4 weeks – so stakeholders with this goal should not influence our goals; in this time we cannot claim to reliably identify safe and unsafe physicians – DR (perfect)
DR – I would also add that the program is trying to meet the needs identified by residency/medical schools. The minimal level of skill has already been assessed, so the goal of the PRP is not to create safe or unsafe physicians; that has been assessed before.
Focused feedback – but does it meet the trainees’ expectations? Do they know what to do with the information?
Employ goal-directed teaching – goals developed by the curriculum committee, physician consultants, etc., but does it match the trainees’ needs? What about programs in Ontario?
Opportunities for deliberate practice – but trainees value specialized training and devalue training on intrinsic roles.
Overall, we cannot measure success without first understanding how our goals match or mismatch with their goals. DR – perfect. Following some of the ideas that I mentioned earlier, I think there are two levels of mismatch: one at the goal level (trainees’ vs. curriculum committee goals) and the other at the process level (trainees expect a particular process to learn something and the committee has chosen a different one). Both of these cases, in my opinion, affect the measurement of the program’s success.
Stats for passing Royal College exam - On the examinations of the Royal College of Physicians and Surgeons of Canada (RCPSC), the relative success rates between IMGs and graduates of Canadian medical schools are less striking, but still different. From 2005 to 2009, for candidates on their first attempt, the CMG pass rate for primary specialty examinations was 95%, while the IMG pass rate was 76%; for subspecialty examinations, the success rates were 96% and 75% respectively.
DR- I really like the idea of questioning if the PRP can do something about this. In some of the data that I have been analyzing I have realized that sometimes the differences among residents could have been fixed/prevented at the selection process. A 2-4 week course is not going to be able to fix everything.
Norcini
Half the items assessed socio-cultural and educational challenges encountered by IMGs – a novel study at the time.
Program evaluation is a systematic approach of data collection and analysis, generally used to solve quality and effectiveness questions about programs (Office of Planning, 2010). Evaluation in Health Professions Education (HPE) is recognized as a key element for assuring high-quality programs (Norman, 2002; Parker, Burrows, Nash, & Rosenblum, 2011; Parker, 2013). However, traditional evaluation approaches in HPE may not address the intrinsic complexity of educational programs. This limitation prompted evaluators to re-conceptualize evaluation as requiring a holistic approach with guiding questions such as “how and why is the program working?” and “what is happening within a program?” Although the conceptual basis of holistic evaluation approaches has been described previously (Haji, Morin, & Parker, 2013), there are currently no clear methods for how to conduct such evaluations. In this paper, we argue that sub-disciplines in engineering provide tools that could be reimagined as the methods for conducting holistic program evaluation.
Our goal is to understand the program from different angles: identify whether the planned processes produce the planned outcomes or not, and what outcomes emerge that were not planned.
DR – Here is some rationale that I have developed over the last few months regarding the holistic evaluation approach:
The holistic evaluation approach is a response to the lack of tools to account for the intrinsic complexity of educational programs in health professions education. A main characteristic of a complex system (in systems theory) is that complex systems exhibit unexpected behaviours, called emergent properties. Emergent properties are the key characteristic that differentiates a complex system from a non-complex system.
Therefore, as a response to the lack of tools to capture complexity, the holistic evaluation approach suggests not only paying attention to the planned processes and outcomes, but also shifting the focus to include the emergent processes and outcomes (intrinsic to complex systems, i.e., educational programs). In doing so, the holistic evaluation will be able to provide a more in-depth explanation of the dynamics of the program, compared to other evaluation approaches. The emergent properties of a program are usually, but not limited to, misalignments between stakeholder expectations at the goals, processes and resources levels. (Let me know if this is clear)
If something is different from what was expected, or if something that was supposed to appear is not captured.
DR – Lately I have been including the basic definition of emergence provided by systems theory: emergent properties are those unexpected behaviors exhibited by the system that cannot be explained by its individual elements or by the interactions of the components in the system.
Then I translate this from systems theory to education. The components of an educational program are the stakeholders, goals, processes and resources of the system. Different groups of stakeholders conceptualize goals, processes and resources differently, so we analyze each stakeholder group individually (e.g., designers, instructors, residents). If, when analyzing the program, a goal, process or resource is different from what was planned, it is considered emergent. If a goal, process or resource was planned and cannot be identified when analyzing the program, the absence is considered emergent as well. Let me know if this is clear.
Spreads negative attitudes and can affect the trainees’ expectations