This document summarizes research on the effectiveness of different mentoring program practices. It presents a framework for weighing the evidence behind program practices according to its level and strength. Research suggests that practices such as structured mentor-youth activities, ongoing mentor training, clear meeting-frequency expectations, and using mentors from helping professions can positively affect youth outcomes. However, more research is still needed to strengthen the evidence base on specific program practices. The document encourages practitioners to consider this research, look for ways to incorporate its findings, and improve their program evaluation.
Issue 2: Effectiveness of Mentoring Program Practices.
This series was developed by MENTOR and translates the latest mentoring research into tangible strategies for mentoring practitioners. Research In Action (RIA) makes the best available research accessible and relevant to the mentoring field.
CSU Extension, Engagement and the Logic Model - Steven Newman
Presentation delivered to graduate class Principles of Extension.
Much of the material in this lecture on extension, the logic model, and the scholarship of engagement was taken from the University of Wisconsin-Extension Program Development and Evaluation program.
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
Innovation Network's own workbook on evaluation planning. Can be used alone or in conjunction with the Evaluation Plan Builder at the Point K Learning Center.
Jennifer Kuschner, Program Development and Evaluation Specialist, UW-Extension
Kerry Zaleski, Monitoring and Evaluation Project Coordinator, UW-Extension
This interactive session provided participants with an overview of what a logic model is and how to use one for planning, implementing, evaluating, or communicating about co-curricular community service activities. The session also gave participants an opportunity to work in teams to create their own logic models.
EDUC 8103-6: A7: Program Proposal, Section 5: Program Evaluation - eckchela
This is Section 5 (Program Evaluation) of the Walden University course EDUC 8103-6. It is formatted in APA style, has been graded (A), and includes references. Most universities submit higher-education assignments to Turnitin, so remember to paraphrase. Enjoy your discovery!
2. Grantseeking: Creating a Program Logic Model - Rebecca White
Grants for beginners: Module 2 of the grant-seeking series. It covers how to develop a program logic model for grant development. Building a basic program logic model involves highlighting the situation and priorities; developing an overall program goal; determining program outcomes, outputs, and inputs; identifying any assumptions and external factors in play; and developing a program evaluation plan.
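To make these components concrete, the sketch below expresses a basic logic model as a plain data structure in Python. It is only illustrative: the field names and the example mentoring program are assumptions for demonstration, not content from the module.

```python
# Minimal, illustrative sketch of the pieces a basic program logic model captures.
# The field names and the example program below are hypothetical.
from dataclasses import dataclass, field
from typing import List


@dataclass
class LogicModel:
    situation: str                      # need or problem the program addresses
    priorities: List[str]               # what the program chooses to focus on
    goal: str                           # overall program goal
    inputs: List[str]                   # resources invested (staff, funding, time)
    outputs: List[str]                  # activities delivered and who is reached
    outcomes: List[str]                 # short-, medium-, and long-term changes
    assumptions: List[str] = field(default_factory=list)
    external_factors: List[str] = field(default_factory=list)
    evaluation_plan: str = ""           # how progress toward outcomes will be measured


# Example instance for a hypothetical school-based mentoring program.
example = LogicModel(
    situation="Ninth graders at risk of disengagement lack consistent adult support",
    priorities=["academic engagement", "stable adult relationships"],
    goal="Improve school engagement among mentored youth",
    inputs=["trained volunteer mentors", "program coordinator", "grant funding"],
    outputs=["weekly one-hour mentor meetings", "quarterly mentor training sessions"],
    outcomes=["improved attendance", "stronger connection to school"],
    assumptions=["mentors can commit for a full school year"],
    external_factors=["school schedule changes", "mentor turnover"],
    evaluation_plan="Track attendance records and engagement surveys each semester",
)
```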
A textbook must provide, first and foremost, information to assist the reader in better understanding the topic. Second, it ought to provide that information in a way that can be easily accessed and digested, and it needs to be credible. Textbooks that have gone through multiple editions continue to improve as a result of reviewers’ comments and readers’ feedback, and this one is no exception. Looking back over the efforts associated with this Fifth Edition, the old wedding custom of “something old, something new, something borrowed, something blue” comes to mind. We have built upon the solid foundation of previous editions, but then added “something new.” It almost goes without saying that we have “borrowed” from others in that we both cite and quote examples of program evaluation studies from the literature. “Something blue” . . . well, we’re not sure about that. Those who have used the Fourth Edition might be interested in knowing what has changed in this new edition. Based on reviewers’ comments we have:
• Created a new chapter to explain sampling.
• Incorporated new material on designing questionnaires.
• Overhauled the chapter on qualitative evaluation. It is now “Qualitative and Mixed Methods in Evaluation.”
• Reworked the “Formative and Process Evaluation” chapter with expanded coverage on developing logic models.
• Added new studies and references, and new Internet sources of information.
• Included new examples of measurement instruments (scales) with a macro focus.
• Inserted new checklists and guides (such as ways to minimize and monitor for potential fidelity problems, in Chapter 13).
• Revised the chapter “Writing Evaluation Proposals, Reports, and Journal Articles” to give it less of an academic slant. There is new material on writing executive summaries and considerations in planning and writing evaluation reports for agencies.
• Deleted the chapter on Goal Attainment Scaling.
National Trends Affecting Community Engagement and Planning - Bonner Foundation
As part of our strategic planning with Maryville College, we will discuss how current national trends in higher education, nonprofits, and community engagement are shaping the local landscape and direction.
Opportunities for local people to hold NGOs to account for their actions have improved in recent years, but there has been little evidence to suggest that they can actually influence the quality and results of aid itself - until now.
This report provides concrete evidence of the way accountability mechanisms improve the value for money, effectiveness, relevance, and sustainability of humanitarian and development projects.
Fevatools is a web-based toolkit to jump-start your efforts to conduct formative evaluation of student learning and course design. Come learn more about how SDSU faculty are using freely available, web-based tools to gather data that informs iterative refinement of their course designs.
Building a Successful Mentoring Program: Mentor Support, Recognition, & Retention - Mentor Michigan
Join us for this webinar to learn about standards 7 and 8, focusing on mentor support, recognition, and retention, as well as match closure procedures. In this webinar, we will examine how to support and provide recognition to mentors and other volunteers for their hard work and we will discuss the importance of using a formal match closure procedure. We will identify a variety of methods of providing ongoing mentor support, training, supervision, and recognition as well as explore the key aspects of a match closure procedure.
To download the Quality Program Standards and Checklist, please visit: http://www.michigan.gov/mentormichigan/0,1607,7-193--123108--,00.html
Fundamentals for Impacting Student Success - Jim Black
Topics include influencing student retention behavior, understanding student attrition factors, leveraging student retention data and research, retention best practices, and managing change to impact campus-wide engagement in retention.
There have been signs the job market is rebounding, which means you’re going to have to start bringing your A-game again. But perhaps your organization’s financial situation is not quite keeping up with the national job reports. Learn the best ways to welcome entry-level hires and attract new ones. This presentation will give you and your organization the tools needed to start building a better, more cohesive work environment.
5 Ways to Build a Better Leadership Development Program | Webinar 06.09.15 - BizLibrary
Leadership remains the top human capital concern. Poor leadership practices cost companies millions of dollars each year by negatively impacting employee retention, customer satisfaction, and overall employee productivity.
In this webinar we'll provide four leadership development best practices that meet challenges faced by today's leaders and offer you tools for implementing leadership development initiatives in your organization.
What you'll learn:
- Importance of Leadership Development
- Best Practices, including:
- Strong executive involvement
- Use of tailored leadership competencies
- Alignment with the business strategy
- A “leaders at all levels” approach
www.bizlibrary.com
Why youth mentoring as an intervention strategy?
Why be systematic/rigorous about developing (and improving) mentoring intervention strategies and evaluating their effectiveness?
What is “best practice” when developing mentoring intervention strategies?
What are the most rigorous and informative methods for evaluating youth mentoring intervention strategies?
March 2, 2011 - Ongoing Training for Mentors, part of monthly Quality In Action webinar series hosted by the Mentoring Partnership of Minnesota.
Standard 5 of the Elements of Effective Practice for Mentoring™, Third Edition outlines benchmarks for providing quality monitoring and support for matches. One of those benchmarks is that programs provide "one or more opportunities per year for post-match mentor training." Join this webinar to learn and share ongoing training resources, ideas for training topics, and strategies for getting mentors to show up. Amy Cannata from the National Mentoring Center will talk about their new FREE resource, Talking it Through: Communication Skills for Mentors, an interactive website that uses video stories and other tools to enhance ongoing mentor training.
SOCW 6311 Week 11 Discussion 1: Peer Responses
Respond to at least two colleagues by doing the following:
Offer critiques of their analyses. Identify strengths in their analyses and strategies for presenting evaluation results to others.
Identify ways your colleagues might improve their presentations.
Identify potential needs or questions of the audience that they may not have considered.
Provide an additional strategy for overcoming the obstacles or challenges in communicating the content of the evaluation reports.
Put the colleague's name first and the references after each person's response.
The instructor wants the layout like this:
Respond to at least two colleagues (two peer posts are provided) by doing all of the following:
Identify strengths of your colleagues’ analyses and areas in which the analyses could be improved.
Your response
Address his or her evaluation of the efficacy and applicability of the evidence-based practice,
Your response
[Evaluate] his or her identification of factors that could support or hinder the implementation of the evidence-based practice,
Your response
And [evaluate] his or her solution for mitigating those factors.
Your response
Offer additional insight to your colleagues by either identifying additional factors that may support or limit implementation of the evidence-based practice or an alternative solution for mitigating one of the limitations that your colleagues identified.
Your response
References
Your response
Peer 1: McKenna Bull
RE: Katie Otte Initial Post-Discussion 1 - Week 11
Identify strengths in their analyses and strategies for presenting evaluation results to others.
You provided an insightful analysis of this particular process evaluation, and it seems that you were able to design a comprehensive presentation guideline. I agree with your tactic to break the presentation up into categories, and the categories you have selected seem to address the major components of the program, the evaluation itself, and the findings of said evaluation. You also provided a great analysis and summary of the PATHS program. The purpose of the program is clear, and the overarching purpose of the evaluation was made clear in your synopsis as well.
Identify ways your colleagues might improve their presentations.
You addressed outcome measures very well; however, some information about the overall evaluation methods may have been lacking, such as who was collecting the data, how they were trained, and how their training or standing could limit potential bias. This may be an important piece of information that could help give audience members a better understanding of the evaluation process as a whole.
Identify potential needs or questions of the audience that they may not have considered.
As mentioned by Law and Shek (2011), this program was designed and facilitated in Hong Kong, China.
Issue 3: Program Staff in Youth Mentoring Programs: Qualifications, Training, and Retention.
This series was developed by MENTOR and translates the latest mentoring research into tangible strategies for mentoring practitioners. Research In Action (RIA) makes the best available research accessible and relevant to the mentoring field.
Hosted by Mentoring Partnership of Minnesota on October 30, 2012.
The Mentoring Best Practices Research Project, funded by the Office of Juvenile Justice and Delinquency Prevention (OJJDP), is being conducted in collaboration with Global Youth Justice and the National Partnership for Juvenile Services.
Presented October 18, 2012 - Part of 2012 Collaborative Mentoring Webinar Series
Education Northwest/National Mentoring Center, Friends For Youth, Indiana Mentoring Partnership, Kansas Mentors, Mentoring Partnerships of Minnesota and of Southwest Pennsylvania, Mentor Michigan, Mobius Mentors, Oregon Mentors and other partners are working together in 2012 to deliver this free monthly webinar series for mentoring professionals.
For updates about upcoming webinars, subscribe to the Chronicle of Evidence Based Mentoring forum: http://chronicle.umbmentoring.org/category/forum/ and at MENTOR/The National Mentoring Partnership.
January 19, 2012 - 1/12 in 2012 Collaborative Mentoring Webinar Series
Featured panelists:
David DuBois, Ph.D., University of Illinois at Chicago &
Tom Keller, Ph.D., Portland State University
Why Youth Mentoring Relationships End, with Dr. Renee Spencer, September 2011. Part of the monthly Quality In Action webinar series hosted by the Mentoring Partnership of Minnesota.
2. Research In Action: Overview of Series. Last year, MENTOR released the National Agenda for Action: How to Close America's Mentoring Gap. Representing the collective wisdom of the mentoring field, the Agenda articulates five key strategies and action items necessary to move the field forward and truly close the mentoring gap. In an effort to address one of these critical strategies (elevating the role of research), MENTOR created the Research and Policy Council, an advisory group composed of the nation's leading mentoring researchers, policymakers, and practitioners. In September 2006, MENTOR convened the first meeting of the Research and Policy Council with the goal of increasing the connection and exchange of ideas among practitioners, policymakers, and researchers to strengthen the practice of youth mentoring. The Research in Action series is the first product to evolve from the work of the Council, taking current mentoring research and translating it into useful, user-friendly materials for mentoring practitioners.
3. Research In Action Issues:
Issue 1: Mentoring: A Key Resource for Promoting PYD
Issue 2: Effectiveness of Mentoring Program Practices
Issue 3: Program Staff in Youth Mentoring Programs
Issue 4: Fostering Close and Effective Relationships
Issue 5: Why Youth Mentoring Relationships End
Issue 6: School-Based Mentoring
Issue 7: Cross-Age Peer Mentoring
Issue 8: Mentoring Across Generations: Engaging Age 50+ Adults
Issue 9: Youth Mentoring: Do Race and Ethnicity Really Matter?
Issue 10: Mentoring: A Promising Intervention for Children of Prisoners
13. Mentoring Best Practices: Circles of Evidence. The circles of evidence informing best practices are Research, Professional Expertise & Experience, Local Resources & Needs, and Client/Stakeholder Preferences & Beliefs. Several of these circles draw on evidence obtained from sources other than formal research, such as client satisfaction surveys, program participant outcomes, and community demographic trends.
16. Levels of evidence (an illustrative sketch follows this list):
Level 1 – the mentoring program practice of interest (POI) is isolated
Level 2 – less precise comparisons in which the POI is not isolated
Level 3 – qualitative research
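A minimal sketch of how these levels might be applied when sorting studies, assuming a simple rule-based reading of the slide; the function name and its inputs are hypothetical, and only the level definitions come from the slide text.

```python
# Illustrative only: classify a study into one of the three evidence levels
# named on the slide. The inputs are hypothetical study characteristics.
def evidence_level(isolates_poi: bool, has_comparison: bool, is_qualitative: bool) -> int:
    """Return the level of evidence a study provides for a mentoring
    program practice of interest (POI)."""
    if isolates_poi:
        return 1  # Level 1: the study isolates the practice of interest
    if has_comparison:
        return 2  # Level 2: less precise comparison; the POI is not isolated
    if is_qualitative:
        return 3  # Level 3: qualitative research
    raise ValueError("Study does not fit any of the three evidence levels")


# Example: a comparison study that does not isolate the practice of interest.
print(evidence_level(isolates_poi=False, has_comparison=True, is_qualitative=False))  # 2
```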