CES 2013: Towards a Canadian Definition of Evaluation

By Mary Kay Lamarche
Presentation Transcript

    • Towards a Canadian Definition of Evaluation
      Cheryl Poth, CE, University of Alberta
      Mary Kay Lamarche, CE
      Cairine Chisamore, CE
      Alvin Yapp, Master's student, University of Alberta
      With special thanks to Erin Sulla, University of Alberta
      Image: http://absoluteadvantage.org/images/contentmgmt/edition_38/evaluation_definition.jpg
    • +OutlineWho & Why of this PresentationHow & What did we find in our work?Tamed Literature SearchWild West of Social MediaUnanticipated Survey ChallengesWhere should we go from here?2
    • Who & Why of this presentation? From each panelist's perspective:
      What were the beginning conversations for this work?
      How did we undertake this work?
      What pressing needs did we anticipate?
      What was surprising about this process?
    • How & What did we find in our work? Tamed Literature Search
      Rationale: what can a literature search offer?
        Access to the perspectives of a wide variety of people
        A systematic approach to minimize bias
      Process involved in bounding the search (i.e., parameters):
        Years: 2008 to present
        Databases used
        Keywords for search
        Additional limits: scholarly peer-reviewed, full-text, English
      (These inclusion limits are sketched as a filter below.)
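
The bounding parameters above amount to a simple inclusion filter. Here is a minimal sketch of that filter in Python, assuming each database export arrives as a list of records; the field names are illustrative stand-ins, not drawn from any particular database's API.

```python
# Illustrative records; only the bounds themselves (2008-present,
# peer-reviewed, full-text, English) come from the presentation.
records = [
    {"title": "Defining evaluation in Canada", "year": 2010,
     "peer_reviewed": True, "full_text": True, "language": "English"},
    {"title": "An older piece", "year": 2005,
     "peer_reviewed": True, "full_text": True, "language": "English"},
]

def within_bounds(record):
    """Apply the search bounds stated on the slide: published 2008 or
    later, scholarly peer-reviewed, full-text, and in English."""
    return (record["year"] >= 2008
            and record["peer_reviewed"]
            and record["full_text"]
            and record["language"] == "English")

included = [r for r in records if within_bounds(r)]
print(f"{len(included)} of {len(records)} records meet the search bounds")
```
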
    • Tamed Literature Search: How are we defining evaluation in Canada?
      Government remains summative-focused, reflecting an accountability purpose and an aim of generalizability
        E.g., Treasury Board of Canada Secretariat Policy on Evaluation 3.1: "In the Government of Canada, evaluation is the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver them or to achieve the same results."
      Others focus on a narrower scope, with a purpose of understanding specifics
        E.g., evaluation is interested in understanding what is happening in a specific program rather than attempting to generalize findings more broadly; it focuses on naturally occurring groups and events as presented rather than controlling the setting and focusing on isolated variables (Levin-Rosalis, 2003).
      The most common definition relates to an approach
        E.g., developmental evaluation provides right-timed feedback and data that are necessary for supporting adaptation and reorganization in highly dynamic, multidimensional and interdependent interventions (Patton, 2011).
    • Tamed Literature Search: What are the strengths of evaluation in Canada?
      The profession of evaluation has a greater presence
      Professional designation has played a role
      Increased use of evaluations across contexts
      Planning for evaluations as part of program planning is more common
      Program design is more frequently informed by evaluation
    • Tamed Literature Search: What are the limitations of evaluation in Canada?
      There is a lack of clarity in the definition of evaluation
      Few resources lead to post-implementation studies
      Rigidity in evaluation within government policies
      Little research exists on evaluations and what approaches are being used
    • Tamed Literature Search: Interesting Considerations
      Unclear definitions of evaluations and approaches
      Mixed-methods designs seem to be the most common
      Move towards online learning modules, webinars, etc.
      Summative evaluations in government; 2 out of 15 were developmental evaluations; mostly formative
      Participatory, utilization and community-based empowerment evaluations are the most common approaches
    • Invitation for Audience Input
      What do you think about what we are finding?
      Does this jibe with what you expected?
      Is there literature that we should be looking at?
      What groups make sense to compare? Government practitioner, other practitioner, academic?
    • How & What did we find in our work? Wild West of Social Media
      Rationale: what can social media offer?
        Access to the perspectives of a wide variety of people around the world
        User-generated content: unstructured and interactive
      Process and responses (tallied in the sketch below):
        Twitter: used #eval, #evaluation, #CES; 3 responses
        LinkedIn: posted in 9 groups; 50 unique responses from 37 unique responders
        Facebook: 1 response
      Twitter post: "How do you define #evaluation? I'm helping conduct research for #CES on different definitions. Results to be presented @CESToronto2013 #eval"
      Facebook post: "How do you define evaluation? Colleagues and I are doing research for the CES, exploring different definitions of evaluation. Results will be presented at the CES Conference in Toronto this June. For those who follow me on Twitter or are connected via LinkedIn, my apologies - my responsibility is to see what results I get through social media. Please feel free to share the post."
      LinkedIn post: "How do you define evaluation? Colleagues and I are doing research for the CES, exploring different definitions of evaluation. Results will be presented at the CES Conference in Toronto this June."
      When prompted in 1 group, we followed up with some background/clarification in all groups.
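
For illustration, the response counts reported on this slide can be tallied to show how lopsided the platforms were; the figures come from the slide, while the tally code itself is a hypothetical sketch.

```python
# Response counts as reported on the slide.
responses = {
    "Twitter": 3,    # via #eval, #evaluation, #CES
    "LinkedIn": 50,  # unique responses across 9 groups (37 responders)
    "Facebook": 1,
}

total = sum(responses.values())
for platform, count in sorted(responses.items(), key=lambda kv: -kv[1]):
    print(f"{platform:<9} {count:>2} responses ({count / total:.0%})")
```

LinkedIn alone accounts for more than 90% of the responses gathered.
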
    • Wild West of Social Media: What we heard
      Top-of-mind definitions → in-depth discussions of "textbook" definitions → all the considerations necessary when trying to develop a definition
      Need to consider:
        What are we trying to define? Activities, processes, practices, methods, approaches
        Relative to other fields
        Relative to user expectations & opinions
        The evaluand, its context and purpose
        The evaluator
      Overall observation: NO ONE SIZE FITS ALL
    • Wild West of Social Media: Results Overview (chart slide; visual not captured in the transcript)
    • Wild West of Social Media: A Slightly Different Perspective of our Results (chart slide; visual not captured in the transcript)
    • Wild West of Social Media: Our Interpretation of What it All Means
      "Essentially, my job doesn't have a hat. Some jobs have hats. Just look at those hats. Every child you know can tell you which job goes with which hat. I have no hat for my job."
      Source: Lisa O'Reilly, CE, http://lisaoreilly.ca/work/
    • Wild West of Social Media: Can we have a hat, or maybe multiple hats?
      The White Hat: calls for information known or needed
      The Red Hat: signifies feelings, hunches and intuition
      The Black Hat: judgment; the devil's advocate, or why something may not work
      The Yellow Hat: symbolizes brightness and optimism
      The Green Hat: focuses on creativity: the possibilities, alternatives and new ideas
      The Blue Hat: used to manage the thinking process
      Source: The Six Thinking Hats (or modes) from de Bono Thinking Systems, http://www.debonothinkingsystems.com/tools/6hats.htm
    • Invitation for Audience Input
      What do you think about what we are finding?
      Does this jibe with what you expected?
      If we are going to define evaluation, what should we be defining: what we do? What we are trying to achieve? Methods? Skills? Other?
      What do you think: do we have a hat? Or maybe multiple hats?
    • How & What did we find in our work? Unanticipated Survey Challenges
      Rationale:
        If our products don't serve the needs of intended users, they are irrelevant
        Some evidence in social media that users' views are important
      Process:
        Drafted a survey based on social media concepts
        8 closed questions, each with an opportunity for comments
        Piloted with 2 people; took 5-10 minutes
        Not implemented
      Survey to cover (sketched as a data structure below):
        Evaluation: should it make judgments?
        Rate the importance of aspects on which evaluation could draw conclusions, e.g. outcomes achieved, implemented as planned
        Rate the importance of including reasons for conducting an evaluation, e.g. learning, informing decisions
        Systematic process: yes or no?
        Is independence/objectivity important?
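
As a rough illustration of the instrument described above, the draft's closed questions could be represented as a small data structure; the wording and scales below are invented stand-ins, since the actual survey was never released.

```python
# Invented stand-ins for the draft survey's 8 closed questions (four
# shown); each real question also offered an open comment field.
survey = [
    {"id": 1, "text": "Should evaluation make judgments?", "type": "yes_no"},
    {"id": 2, "text": "How important is concluding on outcomes achieved?",
     "type": "rating", "scale": (1, 5)},
    {"id": 3, "text": "How important is learning as a reason to evaluate?",
     "type": "rating", "scale": (1, 5)},
    {"id": 4, "text": "Is independence/objectivity important?", "type": "yes_no"},
]

def validate(answers):
    """Check each answer against its question's type and scale."""
    for q in survey:
        a = answers.get(q["id"])
        if q["type"] == "yes_no" and a not in ("yes", "no"):
            return False
        if q["type"] == "rating":
            low, high = q["scale"]
            if not (isinstance(a, int) and low <= a <= high):
                return False
    return True

print(validate({1: "yes", 2: 4, 3: 5, 4: "yes"}))  # True
```
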
    • Unanticipated Survey Challenges: the plan
      Weekly e-mail to CES members: please pass on the survey link to evaluation users
      CES members to evaluation users: please answer our survey
      Users answer the survey
      It's as simple as that!
    • Unanticipated Survey Challenges: the reality
      Weekly e-mail to CES members: please pass on the survey link to evaluation users
      CES members to evaluation users: please answer our survey
      Users answer the survey
      Or maybe not
    • Invitation for Audience Input
      How important is the user viewpoint to our work?
      What is the best way to access the targeted population of evaluation users?
      Do we have the right questions?
    • Take-Home Message
      Doing this research is hard
      Definitions are unclear
      The challenge will be creating a definition that reflects all perspectives and commands some degree of consensus; how do we go about this?
      Our intention is to be inclusive and not to marginalize the passion we see in the perspectives we have garnered so far
    • Where should we go from here?
      Social network analysis (a minimal sketch follows below)
      Literature review:
        Include French-language literature
        First Nations, Metis and Inuit perspectives
        A deeper look into grey literature and Canadian evaluation theses
        Deductive analysis using definitions
      Evaluation definition survey
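
The social network analysis proposed above could start as simply as the sketch below, using the networkx library; the respondents and connections are invented for illustration, and only the venue names echo the presentation.

```python
# A minimal social network analysis sketch with invented data.
import networkx as nx

G = nx.Graph()
# Hypothetical edges: who responded in which venue.
G.add_edges_from([
    ("Responder A", "LinkedIn group 1"),
    ("Responder B", "LinkedIn group 1"),
    ("Responder B", "LinkedIn group 2"),
    ("Responder C", "Twitter #eval"),
])

# Degree centrality flags the venues (or people) that connect the most
# participants in the definition discussion.
for node, score in sorted(nx.degree_centrality(G).items(),
                          key=lambda kv: -kv[1]):
    print(f"{node:<18} centrality = {score:.2f}")
```
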
    • Invitation for Audience Discussion
      What are the risks associated with choosing one definition?
      Is a definition necessary?
      Where do you think we should go next?
      If time permits, a definition activity:
        What do you notice about your definition?
        What would be important to build on?
        What could be left out; what is not relevant to the Canadian context?
    • Source A: Patton, M. Q. (1997). Utilization-Focused Evaluation. Sage.
      "The systematic collection and analysis of information about program activities, characteristics, and outcomes to make judgements about the program, improve program effectiveness and/or inform decisions about future programming."
      Program evaluation involves:
        (1) the systematic collection and analysis of information;
        (2) a focus on a broad range of topics (accessibility, comprehensiveness, integration, cost, efficiency, effectiveness);
        (3) a design for a variety of uses (management, accountability, planning).
    • Source B: Fournier, D. (2005). Encyclopedia of Evaluation. Also cited by Patton in his Utilization-Focused Evaluation text (2008, 4th edition).
      "Evaluation is an applied inquiry process for collecting and synthesizing evidence that culminates in conclusions about the state of affairs, value, merit, worth, significance or quality of a program, product, person or plan. Conclusions made in evaluations encompass both an empirical aspect and a normative aspect. It is the value feature that distinguishes evaluation from other types of inquiry such as basic science research, clinical epidemiology, investigative journalism or public polling."
    • Source C: Yarbrough, Shulha, Hopson & Caruthers (2011). JCSEE Program Evaluation Standards, 3rd edition.
      "The systematic investigation of the quality of programs, projects, subprograms, subprojects, and/or any of their components or elements, together or singly, for the purposes of decision making, judgements, conclusions, findings, new knowledge, organizational development, and capacity building in response to the needs of identified stakeholders, leading to improvement and/or accountability in the users' programs and systems, ultimately contributing to organizational or social value."
    • Source D: Scriven, M. (1991). Evaluation Thesaurus, 3rd edition.
      "Evaluation refers to the process of determining merit, worth, or value of something, or the product of that process. . . . The evaluation process normally involves some identification of relevant standards of merit, worth, or value; some investigation of the performance of evaluands on these standards; and some integration or synthesis of the results to achieve an overall evaluation or set of associated evaluations."
    • Source E: Preskill & Torres (1999). Evaluative Inquiry for Learning in Organizations. Sage.
      "Evaluative inquiry is an ongoing process for investigating and understanding critical organizational issues. It is an approach to learning that is fully integrated with an organization's work practices, and as such, it engenders (a) organization members' interest and ability in exploring critical issues using evaluation logic, (b) organization members' improvement in evaluative processes, and (c) the personal and professional growth of individuals within the organization."
    • Source F: Rossi, Lipsey & Freeman (2004). Evaluation: A Systematic Approach, 7th edition. Sage.
      "Program evaluation is the use of social research methods to systematically investigate the effectiveness of social intervention programs. It draws on the techniques and concepts of social science disciplines and is intended to be useful for improving programs and informing social action aimed at ameliorating social problems."