Closing the Knowledge Gap Between Evaluators and Stakeholders

  1. Closing the Knowledge Gap Between Evaluators and Stakeholders: Competencies, methods and technologies to optimise evaluator learning during the evaluation process. Prepared for the 2013 Canadian Evaluation Society Conference: Evaluation Across Boundaries. Prepared by Werner Meier, http://www.RBMG.ca, June 10, 2013
  2. Presentation Overview
     Objective: To share with you some of the approaches, evaluation methods and technologies that I have found useful in bridging the knowledge gap.
     • Part 1 – useful attitudes, aptitudes and skills;
     • Part 2 – how to integrate opportunities for learning into the evaluation methodology;
     • Part 3 – demonstrate how CAQDAS can enhance evaluator learning and evaluation quality.
     June 14, 2013, www.rbmg.ca
  3. Part 1: Useful attitudes, aptitudes and skills
  4. What Knowledge Gap?
     Basic premise: At the outset of an evaluation the stakeholders are more knowledgeable about the evaluand than the evaluators.
     • If this statement were not true, the evaluators would likely be in a conflict of interest and unable to provide an independent and impartial perspective.
     • If the statement is true, then what is the nature of the Knowledge Gap?
  5. Where is the Knowledge Gap?
     It depends, but it appears to some degree in all components of the realistic formula:
     Mechanism + Context = Outcomes (Realistic Evaluation, Pawson & Tilley 2008)
     • Mechanism: origin, foundational principles, influential decisions, micro/macro processes, socio-cultural acceptance, outcome triggers, etc.
     • Context: political, institutional, economic, social, cultural, environmental, etc.
     • Outcomes: performance
  6. Overcome the Knowledge Gap
     Useful approaches, attitudes and skills:
     • Approach each evaluation as a learning experience;
     • Identify your information needs and develop a research plan;
     • Use soft skills (EIQ) to enhance openness and collaboration;
     • Employ critical thinking techniques (Scriven & Paul 1987);
     • Use appropriate software applications to systematically analyse large amounts of quantitative and qualitative data;
     • Maintain an impartial attitude, let the evidence speak for itself and be prepared to demonstrate your evidence base.
  7. Part 2: How to integrate opportunities for learning into the evaluation methodology
  8. Small Evaluation Example: Bridging Political Boundaries
     A review of occupational studies in public health care; a two-person evaluation with a $50K budget, an ample implementation timeframe, but a complex and contentious topic with ambitious pan-Canadian outcomes to assess.
     • The Concentric Circles Methodology was implemented in a sequential manner "from the outside in".
     • It involved data collection and analysis for each line of evidence in a predetermined sequence.
     • The acquired understandings and insights were reinvested in the design of subsequent data collection instruments and evaluation techniques.
  9. Concentric Circles Methodology: Lines of Evidence Sequence, Data Sources and Data Collection Techniques
     1. Reference Documents (GoC policy statements, Dept. RPP/DPRs, program and project files; performance data & reports): content analysis of 50 reference documents
     2. Stakeholders (health service delivery providers across Canada): e-survey of 300 organizations with follow-up telephone interviews
     3. Key Stakeholders (Canadian Medical Association, Canadian Nurses Association, College of Surgeons and Physicians, etc.): F2F interviews with 10 senior managers of key health sector professional associations
     4. Advisory Committees (Health Delivery and Human Resources, including P/T government representatives): telephone interviews with 25 committee members
     5. Client Programme (Health Canada and HRSDC): F2F interviews with 4 program managers
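The reinvestment loop at the heart of the sequence above (each ring's findings feed the design of the next instrument, from the outside in) can be sketched in a few lines of Python. This is purely illustrative: the class and function names are invented for the example and are not part of any evaluation toolkit.

```python
from dataclasses import dataclass, field

@dataclass
class LineOfEvidence:
    """One ring of the concentric circles, from the outside in."""
    source: str
    technique: str

@dataclass
class EvaluationState:
    """Insights accumulated so far; reinvested in the next instrument."""
    insights: list = field(default_factory=list)

def design_instrument(line, state):
    # Each instrument's question set is refined using every insight
    # gathered from the outer (earlier) rings.
    return {"line": line.source, "informed_by": list(state.insights)}

def run_sequence(lines):
    state = EvaluationState()
    instruments = []
    for line in lines:  # predetermined outside-in order
        instruments.append(design_instrument(line, state))
        # Data collection/analysis would happen here; a placeholder
        # insight is recorded to show the reinvestment loop.
        state.insights.append(f"insight from {line.technique}")
    return instruments

rings = [
    LineOfEvidence("reference documents", "content analysis"),
    LineOfEvidence("stakeholders", "e-survey + phone follow-up"),
    LineOfEvidence("key stakeholders", "F2F interviews"),
    LineOfEvidence("advisory committees", "telephone interviews"),
    LineOfEvidence("client programme", "F2F interviews"),
]

result = run_sequence(rings)
# The innermost ring's instrument is informed by all four earlier lines.
print(len(result[-1]["informed_by"]))  # prints 4
```

The point of the design is visible in the output: the first instrument is designed with no prior insights, while the final (innermost, client-programme) instrument draws on everything learned in the four outer rings.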
  10. Concentric Circles Methodology: Benefits / Outcomes
      • The lines of evidence implementation sequence was cost-effective and time sensitive;
      • The available time with key informants was optimised;
      • The potential for bias resulting from precipitate contact with those most invested in the programme was avoided; and
      • The knowledge base of the evaluators was gradually built up with each successive data gathering/analysis step, so that better informed questions were asked and answers could be clarified in light of the data already acquired.
  11. Large Evaluation Example
      A strategic policy evaluation of a $500 million Fund with a $750K budget, 7 team members, a short data collection timeframe, large in scope with 33 projects in 6 sectors and pan-African outcomes to assess.
  12. Snowball Methodology: Bridging Information and Cultural Boundaries
      (Diagram: The Funder → Canadian Organizations → International Organisations → African Partners and Beneficiaries)
  13. Snowball Methodology: Data Collection Stages
      (The Funder → Canadian Organizations → International Organisations → African Partners and Beneficiaries)
      • Stage 1 – Document content analysis, interviews, team discussion and sector work planning
      • Stage 2 – Data collection, analysis, sector briefing papers, and team discussion
      • Stage 3 – Data collection, analysis, sector briefing papers, team discussion and mission planning
      • Stage 4 – Data collection, analysis, preliminary findings briefs, team discussion and report preparation
      Implementation challenges, solutions, findings and lessons were reinvested in sharpening the focus on emerging themes and issues.
  14. Snowball Methodology: Benefits / Outcomes
      • The evaluation lines of enquiry were refined and new ones opened in a process of discovery and learning;
      • Standardised data analysis and reporting techniques were developed iteratively as the needs were identified and the evaluation became progressively focussed;
      • The sector evaluators were better prepared to engage project stakeholders and beneficiaries on outcomes achieved and their sustainability; and
      • Presentations and discussions with the client programme on Fund-level policy and programme issues were grounded by the data and findings at the sector level.
  15. Part 3: Demonstrate how CAQDAS can enhance evaluator learning and evaluation quality
  16. Use of CAQDAS in Social Sciences: Literature Search Summary
      "In an examination of the Sociological Abstracts database, we found only 31 references to either Nud*ist, Atlas.ti, NVivo, winMAX, Kwalitan, MAXqda, Qualrus, or Hyperresearch since 1990, compared to 220 references to SPSS, SAS, and Stata."
      The Wow Factor: Preconceptions and Expectations for Data Analysis Software in Qualitative Research, Katie MacMillan and Thomas Koenig, Social Science Computer Review, Vol. 22 No. 2, Summer 2004, 179-186.
  17. Use of CAQDAS in Evaluations: Literature Search Summary
      Scholars Portal search string: CAQDAS (long & short) in Anywhere, PubYear 2008 to present.
      • All journals = 98
      • Journals w/ Method/Methodology in title = 12
      • International Journal of Social Research Methodology = 8
      • Journals w/ "Evaluation" in title = 2
      • Studies in Educational Evaluation = 2
      • AJE/CJPE = 0
  18. Some CAQDAS Method Articles
      • 2004 Software and Method, Intl JRN of Social Research Methodology
      • 2006 Using CAQDAS to Develop a Grounded Theory Project, Field Methods
      • 2009 Advances in Qualitative Methods, Intl JRN of Qualitative Methods
      • 2009 The use of CAQDAS in educational research, Intl JRN of Research & Method in Education
      • 2011 How Technological Developments Change Our Ways of Data Collection, Transcription and Analysis, Forum for Qualitative Social Research
      • 2013 Using diagrams to support the research process, Qualitative Research
  19. Some CAQDAS Applied Articles
      • 2009 Qualitative Data Analysis - A Procedural Comparison, JRN of Applied Sport Psychology
      • 2011 Membership categorization and the accomplishment of coding rules in research team talk, Discourse Studies
      • 2012 Developing midwifery practice through work-based learning, Nurse Education in Practice
      • 2012 Human vs CAQDA Ratings of Spiritual Content in Dreams, The Humanistic Psychologist
      • 2012 Progressive Focussing and Trustworthiness in Qualitative Research, Management Intl Review
      • 2013 Food safety practices and managers' perceptions, Intl JRN of Hospitality Management
  20. Use of CAQDAS in Evaluations: Why Not Us?
      CES-NCC Annual Learning Event – Thematic Lunch Roundtable Summary:
      • little awareness of what this type of software can do and how it can be applied;
      • perception that statistical software has made the processing of quantitative data more reliable, but not so for CAQDAS;
      • most evaluators are "comfortable" with their current qualitative data processing techniques.
  21. Use of CAQDAS in Evaluations: Why Not Us?
      CES-NCC Annual Learning Event – Thematic Lunch Roundtable Summary:
      • Demand: remains soft for more systematic and rigorous qualitative data analysis that is demonstrably evidence-based.
      • Procurement: staff unfamiliarity, a lack of understanding of its utility, cost of licensing and frequency of renewals with added costs.
      • Training: the effort required to develop the technical skills is considered disproportionate in relation to the perceived utility.
  22. Use of CAQDAS in Evaluations: The Demonstration Effect – Free Software
      • QDA Miner Lite – an easy-to-use free version of the popular QDA Miner software.
      • Saturate – a smart solution to memo, code, categorize, search, and archive your text data, tabular data, audio data, and Web pages, all in a multi-user environment.
      • TextStat – produces word frequency lists and concordances.
      • Dedoose – for analyzing text, video, and spreadsheet data; web-based, pay for the months you use the app.
      • CDC EZ-Text – a software program developed to assist researchers create, manage, and analyze semi-structured qualitative databases.
      • CAQDAS Networking Project (http://caqdas.soc.surrey.ac.uk/)
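Two of the outputs named above, word frequency lists and concordances (keyword-in-context lines), are simple enough to sketch directly. The following minimal Python illustration is not taken from any of the listed tools; the sample `notes` text and all function names are invented for the example.

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase word tokens; a crude stand-in for CAQDAS preprocessing."""
    return re.findall(r"[a-z']+", text.lower())

def word_frequencies(text, top_n=5):
    """Word frequency list: the most common tokens with their counts."""
    return Counter(tokenize(text)).most_common(top_n)

def concordance(text, keyword, window=3):
    """Keyword-in-context lines: each hit with `window` words either side."""
    tokens = tokenize(text)
    hits = []
    for i, tok in enumerate(tokens):
        if tok == keyword:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            hits.append(f"{left} [{keyword}] {right}")
    return hits

# Invented interview-note fragment for demonstration only.
notes = (
    "Stakeholders said the fund improved outcomes. "
    "Several stakeholders questioned whether outcomes were sustainable. "
    "Outcomes at the sector level informed fund policy."
)

print(word_frequencies(notes, top_n=3))
print(concordance(notes, "outcomes"))
```

Even this toy version shows the appeal for evaluators: the frequency list surfaces recurring themes across a body of notes, and the concordance lets every claim about a theme be traced back to the exact passages that support it.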
  23. Use of CAQDAS in Evaluations: The Demonstration Effect – Not So Free
  24. Use of CAQDAS in Evaluations: The Demonstration Effect – Not So Free
  25. Use of CAQDAS in Evaluations: The Demonstration Effect (Go to Atlas.ti Demo)
  26. Closing the Knowledge Gap Between Evaluators and Stakeholders: Competencies, methods and technologies to optimise evaluator learning during the evaluation process
      Thank You for Attending
      From Werner Meier, http://www.RBMG.ca, June 10, 2013