Via Evaluation's Jessica Weitzel and Caroline Taggart give you the tools and techniques for maximizing the usefulness of data that most organizations already collect, or could easily begin to collect.
More information: viaevaluation.com
This presentation gives a vivid overview of the basics of conducting a program evaluation, with a detailed explanation of the Logical Framework Approach (LFA, the "Log Frame") and a practical example from the CLICS project. It also covers the CDC framework for program evaluation.
N.B.: Open the deck in SlideShare mode to make full use of the animations.
Brooke Shelley's 2019 AEA presentation about the importance of collaboration in creating effective program evaluations that help ensure project success.
Two Examples of Program Planning, Monitoring and Evaluation (MEASURE Evaluation)
Presented by Laili Irani, Senior Policy Analyst for the Population Reference Bureau, as part of the Measuring Success Toolkit webinar in September 2012.
Learn to create a program logic model. Designed for Cooperative Extension Service professionals providing university outreach programs. Logic models are a mainstay in the program development process for community-based outreach programs.
MONITORING & EVALUATION OF EXTENSION PROGRAMMES (Ayush Mishra)
Highlights extension programme planning, monitoring and evaluation of projects, steps in programme planning, and more.
United Way of Erie County - Programs, Program Monitoring and Evaluation, and ... (Via Evaluation)
Caroline Taggart, Senior Evaluator, was invited by the United Way of Buffalo & Erie County to present at the organization's Board Leadership Training program. Her presentation covered the importance and general tenets of Program Monitoring and Evaluation, with an emphasis on questions a non-profit organization's Board Members can ask to encourage engagement in these activities and ensure quality program delivery and maximum impact.
Performance Management for Nonprofits: Simplifying and Maximizing Organizati... (Community IT Innovators)
Get introduced to the tools necessary to optimize your organization’s current data, enabling you to turn data into information to tell the story of the organization’s impact in a powerful way. Contact Karen Finn of Results Leadership Group and/or Katherine Mowers of Community IT Innovators to explore how you can simplify and maximize your organization's impact data.
This presentation includes:
1. An overview of Results-based Accountability and an approach for identifying impact performance measures (activity during the workshop session);
2. Where to start in assessing your current organizational data and business systems in light of these performance measures;
3. An introduction to a process for reviewing software and determining a system that will be most useful to the organization's operations;
4. An overview of software options used to support performance management, demonstrate impact, and help plan strategically for improvements.
We are happy to have a conversation about where you are, and where you want to go, with your performance management and nonprofit business systems.
2013 OVCN INNOVATION & ACTION! Conference
'If Demonstrating Impact Seems Boring, You're Doing it Wrong' facilitated by Andrew Taylor of Taylor Newberry Consulting Inc.
http://taylornewberry.ca/
#OVCNaction
Results-Based Accountability™ (RBA) is a performance management framework outlined by performance outcomes specialist Mark Friedman in "Trying Hard is Not Good Enough." More than 600 of Vermont's nonprofit and state government leaders have been trained to use RBA to answer three critical performance questions: How much are we doing? How well are we doing it? Is anyone better off? Learn how to promote a "culture of accountability" within your business, organization or coalition. Benchmarks for a Better Vermont offers this 90-minute RBA overview/refresher using examples from Vermont's farm and food systems sector.
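To make Friedman's three questions concrete, here is a minimal sketch that computes one answer to each from hypothetical program records; the field names and figures are invented for illustration and are not drawn from the Vermont training itself.

```python
# A minimal sketch of RBA's three questions on hypothetical program records.
# All field names and figures are invented for illustration.
records = [
    {"student": "A", "attended": 18, "offered": 20, "better_off": True},
    {"student": "B", "attended": 9,  "offered": 20, "better_off": False},
    {"student": "C", "attended": 20, "offered": 20, "better_off": True},
]

how_much = sum(r["attended"] for r in records)            # How much are we doing?
how_well = how_much / sum(r["offered"] for r in records)  # How well are we doing it?
better_off = sum(r["better_off"] for r in records) / len(records)  # Is anyone better off?

print(f"Sessions delivered: {how_much}")
print(f"Attendance rate: {how_well:.0%}")
print(f"Participants better off: {better_off:.0%}")
```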
So you've learned the Results-Based Accountability framework; the next step is to build systems of accountability within the organization. This short course offers the "brass tacks" of building a data collection, presentation and analysis assembly-line with your staff. Michael Moser of the Vermont State Data Center and Shelagh Cooley of Common Good Vermont provide examples, tools and concrete next steps that you can implement immediately. Watch the video here: http://www.cctv.org/watch-tv/programs/make-data-work-you#
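As a rough illustration of that assembly-line idea, the sketch below wires a collect, analyze, and present step into one repeatable pipeline; the CSV layout and field names are invented, not taken from the course.

```python
# A sketch of a collect -> analyze -> present "assembly-line".
# The CSV layout and field names are invented for illustration.
import csv
import io
from statistics import mean

SAMPLE = "participant,satisfaction\nA,4\nB,5\nC,3\n"  # stand-in for a real export

def collect(source):
    # Step 1: read raw records from a CSV export.
    return list(csv.DictReader(source))

def analyze(rows):
    # Step 2: reduce raw records to the few numbers the report needs.
    return {"responses": len(rows),
            "avg_satisfaction": mean(float(r["satisfaction"]) for r in rows)}

def present(summary):
    # Step 3: present the results in plain language.
    print(f"Responses: {summary['responses']}")
    print(f"Average satisfaction: {summary['avg_satisfaction']:.1f} out of 5")

present(analyze(collect(io.StringIO(SAMPLE))))
```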
Audience Engagement Tools - Lessons learned after 10,000 events (John Pytel)
Presentation given by John Pytel, CEO of audience engagement platform Conferences i/o at the 2019 IAEE Expo!Expo! event.
Topics discussed in this presentation:
• Data that shows how to engage your audience more effectively
• Lessons learned after 10,000 events
• The future of CPE audience engagement
• New tools and engagement techniques to make your job easier
About Conferences.io
Our customers spend 80% less time tracking attendance and calculating credit with Conferences i/o. The Virginia Society of CPAs, for example, reduced the time spent reconciling CPE attendance from a combined 320 hours per year across seminars and conferences to only 70 hours, a saving of roughly 78%, after making the transition.
Feedback & Surveys - How to use the Constant Contact Toolkit Part 2 (Frithjof Petscheleit)
Take Marketing To the Next Level with the Constant Contact Toolkit
Finally, with a single login you can engage and grow your audience in all the places that matter: the inbox, mobile, social media, and the web. The Constant Contact Toolkit has beautiful, customizable templates to create your campaign fast. Integrated contact management and real-time reporting insights help you see results with each campaign.
This webinar series introduces all the awesome new Constant Contact tools. With one click you can sign up and take part in all free sessions.
Newsletters and Announcements
Surveys and Feedback
Event Promo & Registration
Deals and Promotions
Autoresponders
Demonstrating the impact and value of your VCSE organisation (CANorfolk)
Part of CAN's 2020 Annual VCSE conference. This interactive session is designed to help you understand how you can demonstrate the value of what your organisation does. Led by Jenny Potkins (NCVO) and Paul Webb (MAP & Centre for Youth Impact), the session introduced how you can articulate the difference your organisation makes, along with some of the processes and tools you can use to measure that difference.
Improving and Demonstrating Impact for Youth Using Qualitative Data (DetroitYDRC)
This workshop provided an overview of how to use qualitative data to improve and demonstrate the impact of youth development programs. Tips for collecting, analyzing and using qualitative data are provided, along with examples of creative ways to visualize it.
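One simple first pass at qualitative analysis, in the spirit of the workshop's tips, is tallying recurring terms in open-ended responses before deeper coding; the responses below are invented examples, not workshop material.

```python
# A sketch of a first-pass qualitative tally: count recurring terms in
# open-ended responses before deeper coding. Responses are invented.
import re
from collections import Counter

responses = [
    "The mentors made me feel welcome",
    "I liked the mentors and the snacks",
    "More time with mentors would help",
]

words = Counter()
for text in responses:
    words.update(re.findall(r"[a-z']+", text.lower()))

for word, count in words.most_common(5):
    print(f"{word}: {count}")
```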
Addressing the Risks & Opportunities of Implementing an Outcomes Based Strategy (Blackbaud Pacific)
Presented by Brenda Dolieslager, Registered Psychologist & Outcome Measurement Consultant
In this webinar Brenda looks at the risks & opportunities that come with implementing an outcomes based strategy.
By watching this webinar you will:
• Learn what is and is not required to successfully adopt an outcomes based strategy.
• Understand how you are positioned to adopt an outcomes based strategy and what your next steps should be
• Assess the risks involved and learn how they can be mitigated
• Be armed with information to commence or further internal and external conversations around outcomes based strategies.
To view the full webinar please visit: https://www.blackbaud.com.au/notforprofit-events/webinars/past
VS Liv MSHQ 2022 - Measuring Up! How To Choose Agile Metrics - Dugan.pdf (Angela Dugan)
How many times have you been asked to deliver on metrics that did not make sense to you, that felt counterproductive to your or the team's effectiveness, or that were seemingly impossible to collect in a sane fashion? Oftentimes, I find that metrics being collected are ones that are easy to collect and report on but are not necessarily the ones that will help the team learn and improve.
When it comes to software delivery, lean and agile practices and methodologies have taken the lead. Metrics have lagged a bit and often rely on very waterfall-style milestones and phase-gates to determine a team's effectiveness. In the spirit of continuous improvement, this session will take a look at the measures we can and should collect from agile teams, why these metrics are relevant and interesting, and how we can use them to help our teams continuously improve.
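As a hedged illustration of the flow-style measures this session favors over phase-gate milestones, the sketch below computes two common agile metrics, cycle time and throughput, from invented ticket data; the session's own metric list may differ.

```python
# A sketch of two flow metrics: cycle time (start to done) and throughput
# (items finished in a window). Ticket data is invented.
from datetime import date

tickets = [
    {"id": "T-1", "started": date(2022, 3, 1), "done": date(2022, 3, 4)},
    {"id": "T-2", "started": date(2022, 3, 2), "done": date(2022, 3, 9)},
    {"id": "T-3", "started": date(2022, 3, 7), "done": date(2022, 3, 8)},
]

cycle_times = [(t["done"] - t["started"]).days for t in tickets]
avg_cycle_time = sum(cycle_times) / len(cycle_times)
throughput = len(tickets)  # completed items in the reporting window

print(f"Average cycle time: {avg_cycle_time:.1f} days")
print(f"Throughput: {throughput} items")
```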
Opendatabay - Open Data Marketplace.pptx (Opendatabay)
Opendatabay.com unlocks the power of data for everyone. The Open Data Marketplace is a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
It is the first open hub of its kind for data enthusiasts to collaborate and innovate. Through robust quality control and innovative technologies like blockchain verification, Opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence, and it leverages cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex: Opendatabay simplifies the acquisition process with an intuitive interface and robust search tools, so you can effortlessly explore, discover, and access the data you need and focus on extracting valuable insights. Opendatabay also breaks new ground with dedicated, AI-generated synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By combining distributed ledger technology with rigorous third-party audits, Opendatabay verifies the authenticity and reliability of every dataset. Security is at the core of the marketplace: it implements stringent measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
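As one plausible reading of how checksum-style verification can back such authenticity claims, here is a minimal sketch that compares a dataset file's SHA-256 digest against a published fingerprint; the file name and digest are placeholders, not Opendatabay's actual mechanism or API.

```python
# A sketch of checksum-based dataset verification. The file name and the
# published digest are placeholders, not Opendatabay's real interface.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    # Hash the file in chunks so large datasets fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

expected = "digest-published-alongside-the-dataset"  # placeholder value
if sha256_of("dataset.csv") == expected:
    print("Dataset matches its published fingerprint.")
else:
    print("Digest mismatch: do not trust this copy.")
```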
As Europe's leading economic powerhouse and the fourth-largest economy globally, Germany stands at the forefront of innovation and industrial might. Renowned for its precision engineering and high-tech sectors, Germany's economic structure is heavily supported by a robust service industry, accounting for approximately 68% of its GDP. This economic clout and strategic geopolitical stance position Germany as a focal point in the global cyber threat landscape.
In the face of escalating global tensions, particularly those emanating from geopolitical disputes with nations like Russia and China, Germany has witnessed a significant uptick in targeted cyber operations. Our analysis indicates a marked increase in the sophistication of cyberattacks aimed at critical infrastructure and key industrial sectors. These attacks range from ransomware campaigns to Advanced Persistent Threats (APTs), threatening national security and business integrity.
🔑 Key findings include:
🔍 Increased frequency and complexity of cyber threats.
🔍 Escalation of state-sponsored and criminally motivated cyber operations.
🔍 Active dark web exchanges of malicious tools and tactics.
Our comprehensive report delves into these challenges, using a blend of open-source and proprietary data collection techniques. By monitoring activity on critical networks and analyzing attack patterns, our team provides a detailed overview of the threats facing German entities.
This report aims to equip stakeholders across public and private sectors with the knowledge to enhance their defensive strategies, reduce exposure to cyber risks, and reinforce Germany's resilience against cyber threats.
Adjusting primitives for graph: SHORT REPORT / NOTES (Subhajit Sahu)
Notes on adjusting primitives for graph algorithms such as PageRank. Compressed Sparse Row (CSR) is an adjacency-list based graph representation. The experiments compare:
Multiply with different modes (map)
1. Performance of sequential vs OpenMP-based vector multiply.
2. Comparing various launch configs for CUDA-based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential vs OpenMP-based vector element sum.
2. Performance of memcpy-based vs in-place CUDA vector element sum.
3. Comparing various launch configs for CUDA-based vector element sum (memcpy).
4. Comparing various launch configs for CUDA-based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA-based vector element sum (in-place).
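For readers who want the flavor of the sequential-versus-parallel comparison without the C/C++/CUDA setup, here is a minimal Python analog of the vector element sum, with multiprocessing standing in for OpenMP; it is a sketch of the technique, not the report's code.

```python
# A Python analog of the report's sequential-vs-parallel vector element sum,
# with multiprocessing standing in for OpenMP. Values are invented.
from multiprocessing import Pool

def chunk_sum(chunk):
    # Each worker reduces one slice, like one OpenMP thread's share.
    total = 0.0
    for x in chunk:
        total += x
    return total

def parallel_sum(values, workers=4):
    # Two-level reduction: per-chunk partial sums, then combine.
    step = (len(values) + workers - 1) // workers
    chunks = [values[i:i + step] for i in range(0, len(values), step)]
    with Pool(workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

if __name__ == "__main__":
    data = [0.5] * 1_000_000
    sequential = sum(data)                   # the sequential baseline
    print(parallel_sum(data) == sequential)  # same result, computed in parallel
```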
3. I. WHY do we measure?
II. WHAT do we measure?
a. The Logic Model in theory…
b. …and in practice
III. WHICH data do we collect?
IV. HOW can we collect data?
V. HOW can we share our results?
VI. Q&A and Discussion
16. Inputs → Activities → Outputs → Outcomes → Impact
Inputs: What do we need to implement the program?
Funding, Space, Curricula, Training, Participants, Staff
17. Inputs → Activities → Outputs → Outcomes → Impact
Activities: What activities need to be offered to help us achieve our intended results?
Tutoring, Curricula, Find Homes, Distribute Food, Provide Counseling, Education
18. Inputs → Activities → Outputs → Outcomes → Impact
Outputs: How will we know that our activities happened?
# of Hours, # of Participants, # Distributed, # Housed, # of Counseling, # Completed, # Contacted
19. Inputs → Activities → Outputs → Outcomes → Impact
Outcomes: What results do we expect?
Change in Knowledge, Change in Skills, Change in Attitude, Change in Behaviors
20. Inputs → Activities → Outputs → Outcomes → Impact
Impact: What do we ultimately want to change?
Conditions in life, such as: Graduation, College Entry, Stable Housing, Earnings, Obesity
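The five columns in slides 16 through 20 can be read as a simple data structure; the sketch below records the slides' own examples under each stage.

```python
# The slides' five-column logic model as a plain data structure, using the
# examples shown on slides 16-20.
logic_model = {
    "Inputs":     ["Funding", "Space", "Curricula", "Training",
                   "Participants", "Staff"],
    "Activities": ["Tutoring", "Curricula", "Find Homes", "Distribute Food",
                   "Provide Counseling", "Education"],
    "Outputs":    ["# of Hours", "# of Participants", "# Distributed",
                   "# Housed", "# of Counseling", "# Completed", "# Contacted"],
    "Outcomes":   ["Change in Knowledge", "Change in Skills",
                   "Change in Attitude", "Change in Behaviors"],
    "Impact":     ["Graduation", "College Entry", "Stable Housing",
                   "Earnings", "Obesity"],
}

for stage, examples in logic_model.items():
    print(f"{stage}: {', '.join(examples)}")
```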
28. Program Monitoring: the systematic and continual documentation of key aspects of a program in order to assess whether the program is being implemented as intended.
Related terms: Quality Improvement, Process Monitoring, Fidelity of Implementation, Performance Management, Process Evaluation
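A minimal sketch of monitoring in the slide's sense, checking delivered activity against plan; the programs and figures are invented.

```python
# Monitoring as "is the program being implemented as intended":
# compare delivered activity against the plan. Figures are invented.
planned = {"tutoring_hours": 200, "meals_distributed": 1500}
actual = {"tutoring_hours": 168, "meals_distributed": 1530}

for measure, target in planned.items():
    fidelity = actual[measure] / target
    status = "on track" if fidelity >= 0.9 else "needs attention"
    print(f"{measure}: {fidelity:.0%} of plan ({status})")
```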
32. Program Evaluation: the systematic method for collecting, analyzing, and using information to answer questions about a program's effectiveness.
Related terms: Program Outcomes, Impact Evaluation, Outcome Evaluation
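A minimal sketch of outcome evaluation in the slide's sense, using pre/post scores to ask whether participants changed; the scores are invented.

```python
# Outcome evaluation as "did participants change": compare pre- and
# post-program knowledge scores. Scores are invented.
pre_scores = [52, 61, 47, 70, 58]
post_scores = [64, 66, 55, 78, 60]

changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
avg_change = sum(changes) / len(changes)
improved = sum(c > 0 for c in changes)

print(f"Average change: {avg_change:+.1f} points")
print(f"{improved} of {len(changes)} participants improved")
```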
66. Did you like the program?
1 2 3 4 5
Are you a hapy person?
Strongly Disagree    disagree    agree    Strongly Agree
Is this your favorite place to be all the time or do you like being with your parents instead?
Yes No
Did you like the snacks?
Yes No
Did you have fun when you were here and talking to your friends?
Strongly Disagree    disagree    agree    Strongly Agree
Did you like the snacks?
Yes No
And on and on for 10 pages…
(The typos, mixed scales, leading and double-barreled questions, and repeated items are preserved from the slide, which shows an example of a poorly designed survey.)
70. Reporting (and collecting) data:
Who is your audience?
Will they read it?
Will they understand it?
How will they use it?