Tracking Use of Campaign Evaluation Findings of Two International Organisations - Glenn O'Neil
1. An evaluation was conducted of two international organizations' communication campaigns from 2009-2010.
2. Four years later, the evaluator interviewed campaign staff to study how the evaluation findings had been used.
3. The evaluator found that use of the findings was mostly non-linear and unanticipated, influenced strongly by internal factors like staff and resources, and that use occurred opportunistically in response to various influences rather than in a planned, linear fashion.
Communicating evaluation findings: challenges and opportunities - Glenn O'Neil
Four challenges and opportunities in communicating evaluation findings: presentation originally made at a Kampala Evaluation Talk, 17 October 2014, Kampala, Uganda, for the Uganda Evaluation Association as part of the GIZ project on Evaluation Capacity Development in Uganda.
Seven new ways to present evaluation findings - Glenn O'Neil
An overview and practical examples of new and innovative ways to present evaluation findings: scorecards, summary sheets, multimedia and video reports, blogs, and interactive web pages, amongst others.
Integrating communications in evaluation - Glenn O'Neil
How can communications be used to optimally support the evaluation process? What are the myths of using communications for evaluations? Originally presented to the Geneva Evaluation Network and the ILO Evaluation Office, January 2020
BetterEvaluation: A framework for planning evaluations - Simon Hearn
A presentation given at the Institute of Medicine Workshop on Evaluation Methods for Large-Scale, Complex, Multi-National, Global Health Initiatives, January 7, 2014, Wellcome Trust, London.
Innovation Network's Veena Pankaj and ORS Impact's Mel Howlett share dataviz products that can be used throughout the evaluation lifecycle, including theory of change, social network analysis, data placemat, strategic debrief deck, H-form, visual report deck, visual executive summary, and timeline.
The document summarizes research on college students' academic research habits and needs. It found that most students do research online and their most common first steps are using scholarly databases or popular search engines. While most students were satisfied with the current online research process, the majority also found it time-consuming. Students expressed wanting "smart search" features, a guide through the research process stages, and tools to help organize sources and show connections between sources and content areas.
This document provides guidance on designing program evaluations. It discusses clarifying the program's goals and strategy, developing relevant evaluation questions, and selecting an appropriate evaluation design and approach. It also covers identifying appropriate data sources and collection procedures, developing plans to analyze data to allow for valid conclusions, and defining key parts of an evaluation plan such as objectives, information sources, data collection methods, and analysis plans.
The basic steps to program evaluation are to first define the purpose and objectives of the evaluation by identifying stakeholders, budget, timeline and intended outcomes. Next, a plan is created which determines the evaluation questions and selects a model to collect both qualitative and quantitative data from sources like questionnaires and interviews. Finally, the data is analyzed and findings are reported in a final evaluation report.
Seminar Data-Driven Decision Making @ School of Business and Economics & Inst... - Mark Graus
Slides of my 15 minute presentation on adopting a more user-centric approach in data-driven personalization. Presented during the joint seminar organized by the data-driven decision making (D3M) research theme at School of Business and Economics and the Institute of Data Science (IDS) at Maastricht University.
Pedagogical and Technological Interventions in Peer Review (Shivers-McNair) #... - annshiversmcnair
In this presentation, I discuss a collaboration with Eli Review, a team of teacher-researchers who built a web app to support peer learning through feedback and revision. The collaboration led to new features on the app and to new research and teaching questions for me.
This evaluation plan aims to determine whether to adopt synchronous communication mechanisms, like video conferencing, to support the "Develop user stories" activity in an online Agile Software Development course. It will survey experts, instructors, and learners from different countries to understand their perspectives on the benefits and challenges of synchronous interaction. A systematic literature review and online questionnaires will be used to gather feedback. The results will help decide whether synchronous tools are feasible given participants' time zones and technological access.
This document presents a framework for understanding how postal contact can be used to gain responses for web surveys. It identifies five key stages in the response process: being motivated to open the mailing, read it, take part in the survey, go online, and complete the questionnaire. Factors like personalization, multiple contact attempts, and device-agnostic design are proposed to positively impact response. Priorities for further research include how to get people to open mailings, what motivates participation, and reducing the effort to respond online. The document concludes by asking discussants for feedback on the identified research priorities and how best practices can be shared with survey agencies.
This document outlines a process to improve the design schedule process. It identifies problems with the current process such as a loss of productivity, information, and quality. Three causes are listed: an ineffective design schedule process, poor communication methods between stakeholders, and a lack of performance measures. Three solutions are proposed: implementing quality improvement processes and measures, creating a family of measures to track metrics, and developing a communication network to address miscommunication. The expected results are increased quality, greater faculty collaboration, utilizing institutional expertise, and increased accountability.
Online assessment and data analytics - Peter Tan - Institute of Technical Edu... - Blackboard APAC
Are you spending lots of time conducting and marking formative assessments, tracking the learning progress of your students, and providing early intervention to help them learn and achieve better grades? If so, using a Learning Management System (LMS) together with a data analytics tool may help to increase your productivity. In this session, we will cover how Blackboard tools can help you conduct assessments in a paperless manner and automate the marking. You will also learn how data analytics can help you turn raw assessment data into meaningful information, helping you identify the 'at-risk' students who need your extra help, the stronger students who need more challenging tasks, and the chapters that may need to be delivered with a different pedagogical approach. Hence, a robust LMS combined with a data analytics tool can improve the quality of teaching and learning and bring about a higher student success rate.
The document discusses utilization-focused evaluation (UFE), which aims to inform decisions by identifying the stakeholders who will use the evaluation and involving them. It identifies key elements of UFE, including selecting stakeholders who care about the evaluation and can use the results. The approach involves identifying intended users and assessing their interest and power to act on results. Participatory approaches similarly involve stakeholders in design, questions, and interpretation, but UFE specifically focuses on decision makers. Weaknesses include staff turnover, which reduces use of results, and the complexity of identifying the right decision makers.
This document discusses the key steps and considerations for conducting a survey:
1) Determine the purpose and objectives of the survey and what questions need to be asked.
2) Decide who will carry out the different roles for implementing the survey such as supervisors, interviewers, and data entry staff.
3) Plan the logistics of carrying out field work such as sampling approach, survey team structure, materials, and costs.
Using Web 2.0 Technologies to Facilitate Learning - larae9411
This slide show describes a collaborative project between students at Florida A&M University (a historically black university in Tallahassee, Fla.) and Midwestern State University (a predominantly white university in Wichita Falls, Texas) in fall 2008. Students used a wiki, blog, e-mail, social networking sites and a videoconference to create, promote and analyze the results of an online survey (using SurveyMonkey) of college student attitudes toward the 2008 presidential election. The presentation was given in August 2009 at the Association for Education in Journalism & Mass Communication in Boston.
The document discusses the importance of evaluating projects and learning technologies. It provides reasons to evaluate, such as ensuring quality, meeting stakeholder needs, and demonstrating effectiveness. It also lists common stakeholders in evaluation like learners, funders, and educators. Several evaluation methods are described, including checklists, questionnaires, interviews, observation, and analyzing system log data. The document stresses considering objectivity, reliability, practicality and validity when choosing evaluation methods. It prompts the reader to select appropriate methods for their case and consider the pros and cons of different options.
Using Web 2.0 Technologies to Facilitate Learning - larae9411
Presentation on the collaboration between students at Florida A&M University in Tallahassee, Fla., and Midwestern State University in Wichita Falls, Texas. Students used a wiki, blog, e-mail and videoconference to create, promote and analyze the results of an online survey about the 2008 presidential election. This presentation was given at the 2009 AEJMC national conference.
This document summarizes a collaboration project between Florida A&M University and Midwestern State University that used Web 2.0 technologies like blogs, wikis, and surveys to investigate student political beliefs across distance. Students created a survey on college student elections that received 900 responses. The data showed that students found wikis more effective than blogs for collaboration, but lacked comfort with these tools. To improve future projects, the researchers concluded that instructors need to design technology use around learning goals and provide more scaffolding to help students apply tools in other contexts.
Presentation about collaborative social media projects between students at Florida A&M University in Tallahassee, Fla., and at Midwestern State University in Wichita Falls, Texas. Students used wiki, blog, e-mail and videoconferencing to create, promote and analyze results of an online survey about the 2008 presidential election.
Developing an evaluation strategy to gain insights into the ROER4D multi-nati... - SarahG_SS
Presentation at the OE Global Conference held in Banff, Canada in April 2015. This presentation introduces the development of the evaluation strategy, using Utilization Focused Evaluation (UFE), for the ROER4D Project (http://roer4d.org/).
You can access the abstract here: http://conference.oeconsortium.org/2015/presentation/developing-an-evaluation-strategy-to-gain-insights-into-a-multi-national-project-roer4d/
The Development and Usability Evaluation of a Standards-Based Grading Tool fo... - Alaa Sadik
The document summarizes a study that developed and evaluated a standards-based grading tool called RealGrade for faculty at Sultan Qaboos University in Oman. The tool aimed to streamline grading practices and facilitate standards-based assessment. Researchers conducted interviews and focus groups to determine faculty needs, then developed RealGrade to automate grading, enable qualitative assessment, and generate reports. 116 faculty members tested the tool and completed a questionnaire, rating RealGrade effective, efficient, and satisfactory overall, with both the more and the less computer-experienced instructors viewing it most positively. Interviews provided additional insights. The researchers concluded that RealGrade supports standards-based assessment, but that the report cards need improvement and more research is needed to address instructor differences.
The document is a certificate from Johns Hopkins University congratulating Frank Hasbani for successfully completing the Data Science Specialization online program. The specialization consisted of 10 courses covering the concepts and tools for data science, including R programming, getting and cleaning data, exploratory data analysis, reproducible research, statistical inference, regression models, practical machine learning, developing data products, and a capstone project. The specialization trained participants in using data science tools, analyzing complex problems, managing large datasets, applying statistical principles, creating visualizations, building and evaluating machine learning models, and publishing reproducible analyses.
Causes of mode effects on survey measurement 2011 - natcenslides
This document provides an overview of a one day symposium on causes of mode effects on survey measurement. The symposium focused on how the mode of data collection can affect survey responses. It explored this topic through several presentations and discussions on how cognitive interviewing, visual/aural stimuli, interviewers, question format optimization, and question design affect comparability of responses across modes. The goal was to provide practical advice on improving portability of survey questions across different modes of data collection like telephone, web, and paper surveys.
Use of evaluation findings: types and influences - Glenn O'Neil
How are evaluation findings actually used? This presentation examines the different types of evaluation use and what influences use. Originally presented at the Swiss National Health Promotion Conference, 19 January 2017.
Evaluating Communication Programmes, Products and Campaigns: Training workshop - Glenn O'Neil
A one-day workshop on evaluating communication programmes, products and campaigns. The main steps and methods are covered, with real-life examples given. This workshop was originally conducted by Glenn O'Neil of Owl RE for Gellis Communications in Brussels in October
A one-day workshop on surveys for communicators. Increasingly, communicators need the ability to evaluate their activities and know what their audiences think and want. Being able to design and set up online surveys is a key tool for communicators for soliciting feedback and interacting with audiences. These slides from the workshop take participants from the design stage to the analysis stage. The workshop was originally conducted on 14 June 2013 for the Geneva Communicators Network.
Crowdsourcing for #internalcomms. Creating shared insight through curious met... - Meaning Business
Great discoveries are driven by curiosity, but capturing and applying insight takes a methodical approach. Collaborative approaches can give communicators front-line information that builds business knowledge and can be turned into performance-driving insight. With employees accustomed to rating, sharing and peer-based recommendations outside work, crowdsourcing is an important approach inside the company. This interactive session will explore methods that generate more than an employee wish list.
Two-way communication in a networked world: user experience (UX), appreciative inquiry and crowd sourcing
How to choose methods that build empowerment and create insight through ownership
Working agile – gathering, rating and 'iterating' ideas in environments where change is constant
This session of the 2013 Melcrum Summit was developed by Jonathan Champ in collaboration with Adrian Cropley ABC.
Insights into global advocacy: Oxfam's GROW campaign - Glenn O'Neil
The document summarizes Oxfam's global GROW campaign from 2011-2015. The campaign had 5 objectives: helping grow social movements, stopping land and water grabs, reaching a climate change deal, investing in small-scale food producers, and responding to food crises. It was active in 50 countries through activities like lobbying, media work, and public mobilization. An evaluation found facilitating factors were creating a consistent brand, engaging Southern partners, and combining policy work with public campaigns. Hindering factors included a difficult start, inability to build a global movement, and lack of Northern coalition-building.
The revolution will be televised internally: enterprise video for internal co... - Meaning Business
The document discusses how video is becoming an increasingly important tool for internal communication as employee expectations change. It notes that 55% of employees expect to see video as part of the internal communication mix, and that 92% of communicators see visual communication as important. The document also discusses how different types of video content are more or less suitable for different levels of employee involvement and understanding regarding organizational changes. It emphasizes that internal video should strive for consistency, enable conversation, and help employees make personal connections.
The Shorter COMMS plan is a simple method for getting better outcomes from communication. Created by Meaning Business, it is available free under a Creative Commons license.
The End of the Story - Transmedia Storytelling for Corporate Communication (Meaning Business)
IABC14 presentation on transmedia story worlds for corporate communication. To listen to the audio of this presentation, check out Soundcloud: https://soundcloud.com/meaningbusiness/meaningbusiness-storyworld-presentation-iabc14/s-wtLwN
March to Standards: #SMMStandards Progress and Roadmap (Tim Marklein)
Presentation by @tmarklein and @kdpaine outlining progress, initial deliverables and a roadmap from the #SMMStandards Coalition and Conclave work, presented June 15, 2012 at the 4th European Summit on Measurement hosted by AMEC in Dublin -- includes contributions from cross-industry collaboration including AMEC, Council of PR Firms, Institute for Public Relations, PRSA, Global Alliance, CIPR, IABC, SNCR, Web Analytics Association, WOMMA and key clients Dell, Ford, Procter & Gamble, SAS, Southwest Airlines, Thomson Reuters
The Sustainable Development Goal #7 to ensure access to affordable, reliable, sustainable and modern energy for all by 2030 has brought about a renewed focus on the 1.1 billion people around the world without any access to electricity. The increasing commercial viability of off-grid technologies provides an effective and scalable complement to traditional electricity grid expansion, and the opportunity to rapidly improve the livelihoods of millions across the globe.
Our panel of experts discussed the commercial viability and potential of off-grid technologies. Speakers from the World Bank Group, the private sector and non-profit sector shared their perspectives, drawing on their experience and knowledge of current sector trends. The event featured the findings and lessons of a recent IEG study: Reliable and Affordable Off-Grid Electricity Services for the Poor: Lessons from World Bank Group Experience.
This learning event was jointly hosted by the Independent Evaluation Group, the World Bank’s Energy & Extractives Global Practice, and the International Finance Corporation’s Clean Energy and Resource Efficiency Group.
Evaluation for PR and Event Management Agencies (Abel Ahing)
Evaluation form for assessing public relations and event management agencies pitching for your work: a useful tool to apply during pitching presentations.
The document discusses how evaluation systems, like the Michelin Guide, can help development institutions improve results.
The Michelin Guide is a trusted evaluation system that motivates restaurants to constantly monitor and improve quality. Independent and self-evaluation systems play a similar role for development organizations in monitoring progress, identifying issues, and adapting over time. A culture of self-evaluation from project start to finish is essential for success, just as internal monitoring incentivizes quality control for Michelin-rated restaurants.
This presentation on evaluating public relations campaigns is an excerpt from a presentation conducted by Shrita Sterlin of Penn Strategies. Shrita conducted this presentation at a forum held by the Center for Nonprofit Success on November 3, 2011 in Washington, DC. The presentation explores best practices for creating data-driven public relations campaigns in nonprofit organizations; provides tips for quantifying social change efforts; and demonstrates ways to measure the process and results of public relations campaigns.
Class 6: Research quality in qualitative methods, 3-2-17 (tjcarter)
This document discusses key ethical issues and methodological considerations for conducting Scholarship of Teaching and Learning (SoTL) research. It outlines assumptions of qualitative research designs, including that they seek to understand meaning and experience rather than generate generalized knowledge. It also discusses eight stages of formative research to generate options and assess interventions. The document emphasizes rigor in qualitative research through credibility, transferability, dependability, and confirmability. It explores mixed methods approaches and priorities for integrating qualitative and quantitative methods.
This document summarizes the results of a five-phase research study conducted on employee donations to the Brookings Area United Way. Phase 1 involved reviewing prior literature and identifying research questions. Phase 2 covered planning the research design and methodology. Phase 3 addressed the specific research methods used. Phase 4 analyzed and reported the findings from interviews and questionnaires, which identified key themes like busy schedules and length of employment affecting donations. Phase 5 involved reconceptualizing the themes and findings. The document concludes with recommendations to increase donations by focusing on younger age groups and improving public relations and communication efforts.
Community engagement - what constitutes success (contentli)
This document discusses evaluating community engagement programs. It explains that evaluation involves systematically collecting information about a program's activities and outcomes to track progress, make judgements, and improve effectiveness. For community engagement specifically, evaluation can determine what worked well or not, if engagement met its objectives, and if it enhanced knowledge and decision-making. The document recommends clarifying a program's logic, outcomes, and purpose of evaluation with stakeholders. It also suggests establishing performance indicators and methods for collecting and analyzing information to both manage programs adaptively and use findings.
Presented by James Little (freelance and University of Sheffield) at The Open University, Milton Keynes, UK on 15 June 2017. This presentation formed part of the FutureLearn Academic Network section (FLAN Day) of the 38th Computers and Learning Research Group (CALRG) conference. For full details, see http://cloudworks.ac.uk/cloudscape/view/3004
This document discusses various qualitative data collection methods used in descriptive research, including observations, interviews, questionnaires, surveys, and examining records. It provides details on how to conduct interviews and design questionnaires, as well as the advantages and disadvantages of different techniques. Specifically, it outlines steps for structured interviews, factors to consider in choosing data collection methods, and how to write questions to avoid biases and ensure understandability.
Qualitative Methods Course: Moving from Afterthought to Forethought (MEASURE Evaluation)
This document summarizes a qualitative methods course developed by MEASURE Evaluation and UNC. It provides:
1) An overview of the course which aims to enhance participants' skills in conceptualizing, designing, and managing qualitative evaluation methods through 11 sessions over 7 days of instruction and practical activities.
2) Details on course content including sessions on qualitative paradigms, question development, data collection/analysis, and quality standards. Teaching methods incorporate discussion, presentations, group work and a case study.
3) Evaluation methods for the course including pre/post-tests, daily evaluations, and a final evaluation to measure success and identify opportunities for improvement.
Evaluability Assessments and Choice of Evaluation Methods (Debbie_at_IDS)
The document discusses evaluability assessments (EAs) and how they can inform the choice of evaluation methods. Key points:
- EAs examine a project's design, available information, and context to determine if and how an evaluation could be conducted. They help ensure evaluations are useful and feasible.
- Common EA steps include reviewing documentation, engaging stakeholders, and making recommendations about a project's logic, monitoring systems, and potential evaluation approaches.
- Choosing evaluation methods depends on the EA results as well as the evaluation's purpose, required credibility, complexity of the intervention, and available resources. Methods like experiments provide strong evidence of impact but are difficult to implement.
- EAs improve evaluation quality by engaging stakeholders early in the process.
Qualitative Methods Course: Moving from Afterthought to Forethought (MEASURE Evaluation)
This document provides an overview of an innovative qualitative methods course for rigorous evaluation. The course was developed by a curriculum advisory committee and piloted with 28 participants from 10 countries. It aims to enhance participants' abilities to conceptualize, design, and manage qualitative evaluation. The course covers major concepts, approaches to qualitative evaluation questions, methods, analysis, standards, and ethics. It uses varied teaching methods including discussions, presentations, and activities like developing a short evaluation protocol. Challenges included balancing theory and practice, integrating gender, and meeting participant needs. Pilot evaluations found the content and facilitation were strong but that timeline, hotel, and data analysis instruction could be improved.
1. The document discusses two types of feedback loops in performance management - funder-partner feedback loops and service delivery feedback loops.
2. Funder-partner feedback loops involve a 7-step cycle between funders and partners to refine programs, assessments, data collection and analysis, and recommendations.
3. Service delivery feedback loops also follow a 7-step cycle but occur internally between an organization's planning, curricula development, assessment, data management, analysis, reporting, and communication steps.
The document discusses evaluation in public relations practice. It defines evaluation and outlines some key principles of evaluation according to Patton, including that evaluation is research-based, looks at both inputs and outcomes, is dependent on user needs and context, can examine both short and long-term impacts, allows for comparisons, and takes a multifaceted approach. Several case studies are presented, including one on a campaign to increase dog adoptions by teaching shelter dogs to drive miniature cars. Evaluation of the campaign found it increased awareness of the organization's partnership and positively changed perceptions of shelter dog behavior.
Master's Program Organization Studies, Tilburg University (Rob Jansen)
This presentation was used to introduce the Master's Program Organization Studies (including the Extended Master). This Master is offered at Tilburg University, the Netherlands, and is fully in English. These slides are from the session of the 10th of March. Want to know more? Mail r.j.g.jansen@tilburguniversity.edu
This document summarizes the findings of a 2013 survey of university provosts regarding student learning outcomes assessment practices. The major findings were: 1) stated learning outcomes are now common, 2) more assessment is being conducted using expanded tools, 3) accreditation drives assessment, 4) assessment data is used more for accreditation than improvement, and 5) faculty engagement is key to effective assessment but remains a challenge. The implications discussed include focusing assessment on classroom practices, recognizing faculty assessment work, and using assessment results for improvement over compliance.
The document discusses developing a research agenda for impact evaluation of development programs. It proposes that the agenda should:
1) Cover different types and purposes of evaluations, questions addressed, users, and those conducting evaluations.
2) Be developed through consultation with various stakeholders and review of existing documentation and examples.
3) Include different types of research like documenting current practices, trials of methods, and longitudinal studies of impact evaluations.
4) Address important questions like how to involve communities and accommodate different views of evidence, and how to represent complex interventions and identify unintended impacts. Support is needed to develop the agenda through legitimate processes and interdisciplinary cooperation.
This document provides an overview of a presentation given by Lena Etuk on why measuring social impact is important. It discusses key terms related to impact measurement like outputs, outcomes and impact. Measuring social impact is important to understand if interventions are making a difference and having their intended effects. The presentation outlines the steps in the impact measurement cycle, including understanding needs, developing a program model and logic map, creating an outcome measurement framework, developing data collection and analysis plans, and implementing measurement. The goal is to learn from measurement to improve programs and demonstrate their value.
This document discusses developing a research agenda for impact evaluation in development. It argues the agenda needs to address more than just causal inference challenges, and should cover all aspects of impact evaluation practice. This includes issues like values clarification, measurement, synthesis, and managing joint projects. The research agenda also needs to recognize development that goes beyond discrete projects to include partnerships and community involvement. Developing the agenda requires consultation, identifying gaps, and reviewing various types of research needed like documenting practice, positive deviance studies, and longitudinal studies. Some example research questions are provided.
This document presents the business case for a national mentoring program for evaluators in Canada. It summarizes the objectives, background, methodology, findings and conclusions of the Core Mentoring Working Group. There is a demonstrated demand for mentoring among evaluators based on multiple surveys. The advantages of mentoring for both mentees and mentors were identified. An effective program would have various dimensions such as different types of mentoring relationships, effective matching processes, and support structures. Issues such as the mentoring lifecycle, risks, and costs need to be considered when developing the program. The working group proposes a national online mentoring service and outlines next steps to pilot and implement the program.
Similar to Communication evaluation: Challenges and complexities
Evaluating Advocacy: Challenges, Methodologies and Solutions (Glenn O'Neil)
This document discusses challenges, methodologies, and solutions for evaluating advocacy efforts. It begins by defining advocacy and distinguishing it from other types of campaigns. Key challenges include focusing on activities rather than outcomes and proving impact. The document recommends understanding the desired changes, monitoring progress, selecting appropriate evaluation methods, estimating influence on changes, and sharing lessons learned. A variety of evaluation methods are described, from stakeholder interviews to contribution analysis. The goal is to integrate evaluation into advocacy strategies to continually improve efforts and demonstrate successes.
Humanitarian advocacy aims to influence policies and actions that better address the needs of vulnerable populations. It encompasses efforts made before, during, and after crises to protect rights and access to assistance. Advocacy goals include ensuring respect for humanitarian principles, protecting affected communities, and supporting an effective humanitarian system. Advocacy approaches can be direct with policymakers or indirect by building public support. Strategies consider objectives, target audiences, appropriate messages and tactics, and monitoring frameworks. Challenges to advocacy include balancing operational risks with speaking out, and representing population needs amid crowded policy environments.
Short presentation on conference evaluation presented to the Geneva Evaluation Network by Laetitia Lienart of IAS and Glenn O'Neil of Owl RE on 16 March 2011
The survey of Lift 2010 conference participants found that:
1) Overall satisfaction was high, with 54% rating it as "good" and 19% as "excellent".
2) The greatest benefits for attendees were networking and inspiration from presentations on new technologies.
3) Attendees would like more inspiring presentations, interactive workshops, and participant discussion at future conferences.
4) Most attendees said they would attend and recommend Lift 2011, though fewer said the conference was worth what they paid compared to previous years.
The survey of LIFT Asia 2009 conference participants received 61 responses, a 14% response rate. Key findings:
1) Overall ratings of the conference were positive, though slightly lower than the previous year.
2) Most participants felt the conference provided interesting information and influenced their thinking about emerging technologies, though fewer agreed it was relevant to their work.
3) All session formats were rated lower than at other LIFT conferences.
Most participants said they would attend and recommend the next LIFT conference.
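As a quick sketch of the arithmetic behind such response-rate figures, the rate is simply responses received over invitations sent. The 436 invitations below are inferred from the stated 61 responses and 14% rate, not a figure reported by the survey:

```python
def response_rate(received, invited):
    """Survey response rate as a whole-number percentage."""
    return round(100 * received / invited)

# 61 responses at a 14% rate imply roughly 61 / 0.14 ~= 436 invitations
# (an inferred figure, used here only to illustrate the calculation).
rate = response_rate(61, 436)  # -> 14
```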
The document discusses the importance of evaluating communications activities to determine their effectiveness and efficiency in achieving goals such as changing knowledge, attitudes, and behaviors, and provides examples of evaluation methods for media campaigns, events, and products. Key points are that evaluation should have clear objectives and indicators, start with small-scale tests, and focus on actual results over creative strategies.
Slide 8 presents a stage model for evaluation that is of value and used, built around the evaluation cycle of methodology, implementation, findings and use:

1. Pre-conditions
- Evaluation integrated in communication function?
- Evaluation incorporated in job and project descriptions?
- Support of evaluation policies and institution?
- Availability of budget?

2. Conceptualisation
- Communication goals and ambitions defined?
- Matched to design and methods?
- Organisational model used?
- Expectations of staff?

3. Approach
- Staff participation?
- Adapt to context?
- Diverse methods?
- Flexibility to change?
- Confidence of staff?

4. Finding value
- Access to evaluation findings?
- Contribute to staff knowledge?
- Support more efficient communications?
- Input into future goals and ambitions?

Outcome: evaluation that is of value and used.
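Purely as an illustrative sketch (not part of the original deck), the four stages and their guiding questions could be encoded as a checklist to self-assess how well evaluation is embedded in a communication function:

```python
# The deck's four-stage model, expressed as a question checklist.
FRAMEWORK = {
    "1. Pre-conditions": [
        "Evaluation integrated in communication function?",
        "Evaluation incorporated in job and project descriptions?",
        "Support of evaluation policies and institution?",
        "Availability of budget?",
    ],
    "2. Conceptualisation": [
        "Communication goals and ambitions defined?",
        "Matched to design and methods?",
        "Organisational model used?",
        "Expectations of staff?",
    ],
    "3. Approach": [
        "Staff participation?",
        "Adapt to context?",
        "Diverse methods?",
        "Flexibility to change?",
        "Confidence of staff?",
    ],
    "4. Finding value": [
        "Access to evaluation findings?",
        "Contribute to staff knowledge?",
        "Support more efficient communications?",
        "Input into future goals and ambitions?",
    ],
}

def completeness(answers):
    """Share of checklist questions answered yes (answers: question -> bool)."""
    questions = [q for qs in FRAMEWORK.values() for q in qs]
    return sum(answers.get(q, False) for q in questions) / len(questions)
```

A team could answer each question yes/no and track the `completeness` score over time; the scoring rule here is an assumption for illustration only.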
"However beautiful the strategy, you should occasionally look at the results." (Winston Churchill)