
Evaluation of Settings and Whole Systems Approaches


This session was led as a Pre-Summit Workshop at the Healthy Minds | Healthy Campuses Summit 2016. Ben Pollard explored the question, "How do you know that your campus initiatives are making a difference?"



  1. Evaluating Systems Approaches to Wellbeing. Ben Pollard, Director, Student Experience Evaluation and Research Unit; Director, Strategic Initiatives, VPSO, UBC
  2. Introduction • What are people hoping to get out of today? • Current roles and interests • Past experience with evaluation
  3. Overview • Overview of “normal” evaluation • Logic of change in individual-focused social programming • Building a shared understanding of the outcomes we are trying to achieve, the logic of what we are doing to achieve them, and how we will know whether we are being successful • Systems thinking • Evaluation in a complex policy area • Evaluating policy and culture initiatives: What are we trying to achieve? How would we know if we are achieving it? How do we know if what we are doing is achieving it?
  4. Shortest overview of evaluation ever • Types of evaluation: summative, formative, developmental • How we can do it: a range of approaches, from action research to randomized controlled trials • How we are going to talk about it today: agnostic on the “best way” – the best way is the one that answers the questions you have, in a way that you and your stakeholders can trust enough to help you make the best decision • Your context, within the broader context • Support learning from others, but recognize that much of this is going to be context-specific • Importance of benchmarks – so you know whether your 70% is bad
  5. “Evaluative thinking” in a complex social policy environment: YOU ARE GOING TO GET IT WRONG… SO LEARN QUICKLY* (*applies to both your interventions and your evaluation approach)
  6. Key tool: the general logic model for individual-focused interventions. [Flow diagram: Inputs or individuals → Activities and outputs → Immediate individual outcomes → Individual long-term outcomes → Population outcomes] (a minimal sketch follows)
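Slide 6’s chain can be captured as a small data structure so that each program spells out its intended path from inputs to population outcomes. A minimal sketch in Python; the program name, inputs and outcome labels are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """One program's hypothesized chain from inputs to population outcomes."""
    program: str
    inputs: list              # resources and the individuals reached
    activities: list          # what the program actually does (outputs)
    immediate_outcomes: list  # short-term changes for participants
    longterm_outcomes: list   # longer-term changes for those individuals
    population_outcomes: list # the campus-level outcomes it should move

# Hypothetical example: a peer-connection program
peer_program = LogicModel(
    program="Peer connection circles",
    inputs=["2 staff FTE", "first-year students who register"],
    activities=["weekly small-group meetups", "trained peer facilitators"],
    immediate_outcomes=["new connections on campus"],
    longterm_outcomes=["sense of belonging", "improved mental health"],
    population_outcomes=["retention", "population wellbeing scores"],
)
print(peer_program.immediate_outcomes)
```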
  7. What are the population outcomes we are trying to achieve with wellbeing initiatives in a post-secondary setting? • At the POPULATION OUTCOMES level: How would we know whether we are achieving those outcomes? • What data do we have? • What data do we need? • Brief table discussion
  8. General types of population outcomes • Within the post-secondary setting: academic outcomes; wellbeing outcomes; organizational outcomes; career outcomes (for staff/faculty) • Post-post-secondary: life outcomes for students; societal outcomes
  9. Population outcomes data • Surveys? • Administrative data? • Research projects?
  10. Activity evaluation for individual-focused initiatives • Program/intervention-level evaluation, e.g.: quality; relevance; satisfaction; effectiveness at driving the short-term outcomes • Brief discussion of program-level evaluations: Who regularly does these? What tools do they use? What are some best practices?
  11. Building the links. [Flow diagram: Activities and outputs → Immediate individual outcomes → Individual long-term outcomes → Population outcomes]
  12. Building the links • Drivers of outcomes: connections and drivers • Can lead to mid-level measures that simplify evaluation: x changes y, and y is more likely to lead to the desired outcome, so do more x and ensure it achieves the change in y (see the sketch after this slide) • E.g. more connections on campus leads to a sense of belonging, which leads to improved mental health, which leads to improved academic and life success • Based on a one-time connection, or on knowledge from the literature • The farther out you go, the more factors affect achievement of the outcomes
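Referenced in slide 12: one way to check the links in a chain like connections → belonging → wellbeing is to estimate each step separately against survey data. A minimal sketch assuming a CSV with hypothetical columns connections, belonging and wellbeing; a real analysis would control for other drivers.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student survey: number of campus connections, sense of
# belonging (1-5), and a wellbeing score (e.g. 0-100).
df = pd.read_csv("student_survey.csv")

# Link 1: do more connections go with a stronger sense of belonging?
link1 = smf.ols("belonging ~ connections", data=df).fit()

# Link 2: does belonging go with the desired outcome (here, wellbeing)?
link2 = smf.ols("wellbeing ~ belonging", data=df).fit()

print(link1.params["connections"], link2.params["belonging"])
```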
  13. What works for whom? [Flow diagram: multiple inputs/individuals and activities/outputs feeding the same chain of immediate individual outcomes → individual long-term outcomes → population outcomes]
  14. Program impact on population outcomes. If the world were easy, it would just be a question of math: number of participants in the program × program outcomes for that participant group = population-level outcomes (a worked example follows)
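A worked version of slide 14’s math, with made-up numbers: even if the easy math held, a program reaching 500 of 40,000 students and improving the outcome for 30% of participants moves the population outcome by well under one percentage point.

```python
population = 40_000   # students on campus (hypothetical)
participants = 500    # students the program reaches
effect_rate = 0.30    # share of participants whose outcome improves

# "Number of participants x program outcomes = population-level outcomes"
improved = participants * effect_rate
population_level_change = improved / population

print(f"{improved:.0f} students improved -> "
      f"{population_level_change:.1%} change at the population level")
# 150 students improved -> 0.4% change at the population level
```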
  15. Discussion • Who has done a logic model for their programs? • Benefits of doing a logic model • Issues/barriers to doing so • Have you ever “run the math” for your executive?
  16. Wellbeing Break
  17. But the world isn’t easy. [Flow diagram: the logic model from slide 13, now surrounded by contextual factors – culture, policy environment, physical environment, fiscal environment, access to services, competing priorities, external factors]
  18. Why think about systems? • Interconnectedness of different components: key to being collectively effective, especially on population outcomes • Addresses non-program issues that can be barriers or causes, and tries to turn them into supports • If one frog were sick, we would treat the frog; if every frog in the pond were sick, we would treat the pond • Opens up other avenues for addressing the issues of individuals within systems • Especially in post-secondary settings, where the population changes every 5-7 years
  19. Ways of thinking about systems • Activity-focused models: Healthy University self-assessment checklist; ISO-style approach; at this level, doesn’t show you whether the activities are good or not; BUT useful for some components – e.g. fiscal environment, availability of wellbeing supports, physical environment, etc. – and those can be evaluated in other ways • Deep interconnected, contingent models: down the rabbit hole; can lead to paralysis if not directed well • Understand interconnected driver models, e.g. a structural equation model (SEM) approach • OR assume interconnection, look to your ability to influence where there are fundamental opportunities, and learn as you go
  20. An SEM driver model of Immigrant Belonging
  21. Another one: Visible Minority Trust in Community (a simplified driver-analysis sketch follows)
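The slides show fitted SEM driver models; as a rough, hedged stand-in, a standardized regression can rank candidate drivers of an outcome such as belonging. The sketch below uses hypothetical survey columns and statsmodels; packages such as semopy (Python) or lavaan (R) would be needed for a full structural equation model like the ones pictured.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey with a target outcome and several candidate drivers.
df = pd.read_csv("belonging_survey.csv")
drivers = ["connections", "perceived_support", "discrimination_experience"]
target = "belonging"

# Standardize so coefficients are comparable across drivers.
cols = [target] + drivers
z = (df[cols] - df[cols].mean()) / df[cols].std()

model = sm.OLS(z[target], sm.add_constant(z[drivers])).fit()
print(model.params.drop("const").sort_values(ascending=False))  # rough driver ranking
```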
  22. Today’s focus: culture and policy • In many ways, they are fundamental: other pieces of the system (funding, built environment, activities, etc.) can flow from culture and policy • They are interlinked: culture can drive policy change, and policy can drive culture change
  23. Assessing policy • Understanding the policy framework: types of policy (rule vs. suggestion vs. norm vs. strategic direction vs. strategic program direction); levels of policy (external, corporate, local); differentiating between the statement of a policy and its enactment – and the enactment of unstated policy (strongly linked to culture) • Assessing key components of policy and their impacts on key desired outcomes • Evaluating approaches to changing policy and its enactment • Evaluating the net impacts of initiatives to change policy and its enactment
  24. Challenges in assessing policy • Challenge #1: unless the target individual has had direct interaction in a bad situation, they will not know WHICH policy is affecting them. You can’t ask them “What do you think of policy #14?” Look to the EFFECT of the policy that is driving the issue; the target can assess the effect – e.g. stress related to exam schedules can point to policies around exam scheduling • Challenge #2: implicit/murky policies – it may not be clear that there is a capital-P policy, but there may be a strong practice-based policy • Challenge #3: enactment – need to look at policies in PRACTICE, rather than just policies in word • Challenge #4: interactions of policies – start by looking at the NET effect, and then tease apart which particular policies are driving it
  25. An alternative logic model for assessing policy. [Flow diagram: Policies → Implementers’ understanding and enactment (i.e. how the policies affect their behaviour) → Policy environment created by the implementers’ enactment of the policies → Impact on the target group]
  26. Opportunities to evaluate policy • Quality of the policy itself: technical reading – does it give the tools, is it clear enough, does it clearly state its purpose, etc.; power/ability to address the issue • Implementers’ understanding and enactment: policy translation; look for unintended consequences – things that don’t fit the PURPOSE of the policy, or that have bad interaction effects with other policies; implementers’ perception of the policy framework for supporting the policy intentions or the issue of interest (i.e. wellbeing) • Individuals’ perception of the effects: go backwards – ask about the drivers and issues affecting their wellbeing, then move backwards to determine whether it is an enactment issue or an issue with the policy itself
  27. Assessing policy CHANGE • Look at it before and after the policy change: Is there a change in how policy implementers act? A change in how the target group perceives things? • Attribution issues: Is it just a natural change that would have happened anyway? Try a control group, or ask for direct attribution (a minimal sketch follows)
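A minimal sketch of the control-group idea from slide 27 as a difference-in-differences comparison: measure the affected group and a comparison group before and after the policy change, and take the difference of the two changes. The file and column names (group, period, outcome) are hypothetical.

```python
import pandas as pd

# Hypothetical survey rows: group ("policy" or "comparison"),
# period ("pre" or "post"), and an outcome score (e.g. exam-stress rating).
df = pd.read_csv("policy_change_survey.csv")

means = df.groupby(["group", "period"])["outcome"].mean()

change_policy = means[("policy", "post")] - means[("policy", "pre")]
change_comparison = means[("comparison", "post")] - means[("comparison", "pre")]

# Difference-in-differences: the change beyond what happened anyway.
print("Estimated effect of the policy change:", change_policy - change_comparison)
```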
  28. Trying it out • Think of a policy you would like to make to support wellbeing – make one up, your fantasy policy to improve wellbeing • What is it intended to do? How will it work? What levers does it use? Who or what is the intended target? What changes do you want to see in the target? Who is going to implement it? • Example: a policy requiring Senate proposals to have completed a wellbeing checklist. Intention: build curricula that are supportive of, and not detrimental to, student wellbeing and student learning. How-it-will-work logic: intended to increase consideration of wellbeing in curriculum design, leading to wellbeing-supporting academic programs, leading to better student wellbeing and learning outcomes. Lever: mandated checklist form/Senate requirement. Intended target: curriculum designers; real consideration of wellbeing. Implementer: Senate • 10 minutes at your table
  29. And pass it to another table • How would you assess its implementation and its effectiveness? • What factors would you have to consider in your evaluation? • What questions would you ask, and of whom? • Are there other data you would want in order to know whether it is effective? • How would you address attribution issues – is this policy what is causing the effect?
  30. Discussion
  31. Wellbeing Break
  32. Assessing culture • Understanding culture: What do we mean by a culture of wellbeing? Dimensions of culture; different sub-cultures • Assessing key components of culture and their impacts on key desired outcomes • Evaluating approaches to changing culture • Evaluating the net impacts of initiatives to change culture
  33. Discussion • What is a culture of wellbeing? • What are the dimensions of that culture? • How does that culture fit with, or compete with, other cultures?
  34. Issues and approaches to assessing culture • A person can be in many cultures at once • Different people interpret the same culture differently • Culture in a vacuum: social response bias in some direct questions; need for comparative value • Manifestation of culture: How are people walking the talk? • Culture may not always be clearly EXPRESSED, or understood in its expression, but it is often clearly FELT
  35. Rocks, ripples and the shore. [Diagram: a cultural influencer and an activity ripple outward through the culture and its members (students) toward individual and population outcomes]
  36. Types of activities • Directed attempts at changing the culture through influencers • Broad public education/messaging to change culture • Types of levers of influence: build understanding of importance; build understanding of how to act; incentives; requirements; tone setting
  37. Assessing culture: ask the shore (students, staff, faculty, etc.) • Culture is the sum of all the rocks and ripples… so ask the shore what the waves feel like • Direct questions: culture; supports; priorities on campus (comparative) • Indirect questions: the types of things you would expect someone inside the culture to say if there is a culture of wellbeing – sense of belonging; community; feeling supported to succeed; confidence in the ability to succeed • Link to personal, population and organizational outcomes
  38. Example of building the links to organizational outcomes… • Strongest predictor of willingness to recommend UBC: feeling that you belong on campus • There is a strong correlation between feelings of belonging on campus and willingness to recommend UBC to others (r = .707, p < .001). [Chart: percentage of students, across all year levels, agreeing or strongly agreeing that “I would encourage others to enroll at UBC”, by level of agreement with “I feel that I belong at this campus”, from Strongly Disagree to Strongly Agree] (a small analysis sketch follows)
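To build the same kind of link with your own survey data, a correlation plus a breakdown by agreement level is usually enough to tell the story. A sketch assuming hypothetical 1-6 agreement columns belong and recommend; it does not reproduce the UBC figures above.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical survey: 1-6 agreement with "I feel that I belong at this campus"
# and "I would encourage others to enroll here".
df = pd.read_csv("experience_survey.csv")

r, p = pearsonr(df["belong"], df["recommend"])
print(f"r = {r:.3f}, p = {p:.4f}")

# Share recommending (agree or strongly agree, i.e. 5-6) at each belonging level.
share = (df["recommend"] >= 5).groupby(df["belong"]).mean()
print(share.round(2))
```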
  39. Assessing changes to culture • Pre/post • If something dramatic/large-scale, you can assess perception, and perception of change related to that intervention • Over time, look for different patterns, recognizing that, for students at least, it will be a substantially different cohort
  40. Assessing the cultural influencers stage • What do we think cultural influencers should do? Believe it is important; understand their role in influencing culture; act like it is important – take actions that reflect a commitment to wellbeing and reflect their role in the culture; be supported in incorporating wellbeing • SIZE OF ROCK AND THE EXTENT OF THE RIPPLE: how much influence each has on the culture; how much they believe in it and act on it; and can they move other cultural influencers? • People can be both a target and an influencer • Link to culture perceptions by the shore
  41. How to measure cultural influencers • Direct measures: ask them about it – survey; key informant interview (also a cultural intervention); relative-importance questions (surprisingly honest); to what extent do they consider the wellbeing of their community and, if applicable, in their decision making?; actions they have taken to promote it • Triangulating measures: perceptions of cultural influencers and how they enact culture – values, beliefs, actions • N.B.: perceptions of cultural influencers and the resulting culture are driven both by word and by action – if broad public communication is not taken up, it rings hollow
  42. Assessing NET change to cultural influencers over time • Cultural influencers can change as a result of many different interventions, and through their own evolution • Over time, monitor: perceptions; values; actions – and the same for triangulation by the shore (students/staff/faculty) • Attribution issues: Is it just a natural change that is happening? Try a control group, or ask for direct attribution
  43. Assessing activities to affect culture • Modified Kirkpatrick model: reaction; learning; behaviour; change (usually measured in terms of change in the environment/organization/culture later)
  44. Evaluating reaction • Satisfaction measures • Relevance/usefulness measures • Process measures • Whether they felt it was tailored for/spoke to them • Try to use a common set of measures across different implementations and different types of activities, to facilitate understanding of best practice and of what is working for whom • Different for broad public education or communications strategies, but can be done in a similar way: assessment of the quality of the campaign; of its visibility; of whether or not it “spoke to them”; of whether or not it rang true
  45. Measuring learning • If there are distinct, testable learning outcomes, use those. IF NOT… • Self-perception of learning; for cultural influencers, this can include belief that it is important, and understanding of the issue and of what they can do to affect it • “Post-hoc pre-post”: self-assessment of the gains they have made, based on where they were when they started and where they are now; allows for measures of GAIN; addresses the issue of preaching to the choir (a small scoring sketch follows the next slide) • Different for broad public education or communications strategies, but can be done in a similar way: focus on whether they feel it gave them new information/new understanding
  46. Example of “Post-hoc Pre-Post”
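A small sketch of how retrospective (“post-hoc pre-post”) ratings can be scored: after the session, participants rate where they were before and where they are now on the same scale, and the gain is simply the difference. Column names are hypothetical.

```python
import pandas as pd

# Hypothetical retrospective survey: after the workshop, participants rate
# "before" and "now" on the same 1-5 understanding scale.
df = pd.read_csv("retrospective_survey.csv")  # columns: understanding_before, understanding_now

df["gain"] = df["understanding_now"] - df["understanding_before"]

print("Mean self-assessed gain:", df["gain"].mean().round(2))
print("Share already high at the start (4+):",
      (df["understanding_before"] >= 4).mean().round(2))  # the "choir"
```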
  47. Measuring behaviour • Set a baseline: prior to the intervention, ask them about their behaviour – have they done the types of action you were hoping for? • Immediate outcomes: predicted behaviour – the likelihood that they will do something different as a result of the intervention and do the type of behaviour you were hoping they would; whether they are more likely to do this behaviour than they would have been before the training • Different for broad public education or communications strategies, but can be done in a similar way for immediate outcomes: Will they answer the “call to action”? • Longer term – 3 months, 6 months, 12 months: ask them about their behaviour (if they have done the prior to…; if they have done anything different); pre-post on the perception of the influencer’s behaviour by the shore… (a follow-up sketch appears below)
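For the longer-term follow-up in slide 47, comparing the rate of the hoped-for behaviour at baseline and at each follow-up wave, among the same participants, is often enough to see movement. A pandas sketch with hypothetical columns participant_id, wave and did_behaviour.

```python
import pandas as pd

# Hypothetical longitudinal file: one row per participant per wave
# ("baseline", "3m", "6m", "12m"), with a 0/1 flag for the hoped-for behaviour.
df = pd.read_csv("behaviour_followup.csv")

rates = df.groupby("wave")["did_behaviour"].mean()
print(rates.round(2))

# Change among people observed at both baseline and 6 months.
wide = df.pivot(index="participant_id", columns="wave", values="did_behaviour")
paired = wide[["baseline", "6m"]].dropna()
print("Paired change at 6 months:", (paired["6m"] - paired["baseline"]).mean().round(2))
```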
  48. Evaluating culture: activity • Develop an intervention to build a culture of wellbeing… or choose one that you are already trying • Spell out the logic of how you believe it will shift the culture, and the effect that that shift will have on personal and population wellbeing outcomes • Write it down • 10 minutes
  49. And pass it to the right • How would you evaluate the proposed approach? • Write out an evaluation plan, showing the steps you would take • How would you measure effectiveness? • How would you address attribution issues? • 30 minutes
  50. Discussion
  51. Final thoughts • Apply evaluative thinking, but don’t get hung up on perfection • YOU ARE GOING TO GET IT WRONG… SO LEARN QUICKLY* (*applies to both your interventions and your evaluation approach) • Support learning across initiatives, and aggregation of results • Build tools that support comparison, with common and specific components • Don’t overdo it: you can get into some serious rabbit holes chasing the PERFECT evaluation, and your results may not hold in the next version, given the number of factors at play in a complex system • There are many off-the-shelf tools that you can modify to your needs • Be constantly learning and adjusting
  52. A quick pitch • UBC would like to work with other institutions across BC to develop a student population wellbeing tool that provides a combination of epidemiological and more conceptually linked data (e.g. culture of wellbeing, broader outcomes), designed to support wellbeing initiatives and better comparative data across BC and Canada • Opportunity for tailoring by institution • And cheaper too!
  53. Questions?
