This presentation describes Patton's developmental evaluation paradigm, and how the Military Families Learning Network utilizes it for network-wide program evaluation.
Developmental Evaluation on the Ground
1. Developmental Evaluation on the Ground
Brigitte Scott, Ph.D.
Evaluation and Research Specialist
Military Families Learning Network
2. Military Families Learning Network
Personal Finance
Child Care
Family Advocacy
Network Literacy
Family Development
Transition Support
Community Capacity Building
Nutrition & Wellness
Early Intervention
Special Needs
Life Span Special Needs
Military Caregiving
4. MQP: DE Defined
“DE supports innovation development to guide adaptation to emergent and dynamic realities in complex environments.” —MQP, p. 1
Goal: support project, program, product, and/or organizational development with timely feedback
5. MQP: Core DE Question
What is getting developed, and what are the implications of what gets developed?
Complexity theory of change: bring people together who are knowledgeable and committed, and they will self-organize, take action, and work together to create movement, innovation, and change.
8. Focus on Development (MQP)
• NOT the same as continuous improvement (formative evaluation or accountability)
– Quality improvement is helping programs meet standards that have been set.
– Development is when people are changing what they are doing, and the very nature of the standards is also changing: standards are adapting to changing conditions.
10. Complex and Dynamic Environment (MQP)
Complex environment = lack of central control
What to do to solve problems is uncertain, and there can be conflict about how to proceed.
14. Developmental Evaluator (MQP)
• Works collaboratively with innovators to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development
• Asks evaluation questions
• Applies evaluation logic
• Gathers and reports evaluation data
15. Developmental Evaluator (MQP)
Primary functions:
• Elucidate the innovation and adaptation processes
• Track implications and results
• Facilitate data-based decision-making
16. “There are no best practices. ‘Best’ is entirely context-free.” —MQP, AEA 2014
17. Evaluation for MFLN
• Ongoing development
• Innovation in learning
– Internal practice
– External product
21. Cooperative Agreement
“An opportunity to provide responsive and timely educational programming through being actively nimble, flexible, innovative, and creative in true partnership with our funders as they identify organizational priorities and ask us to engage in the construction of appropriate and necessary deliverables that meet the on-time needs of the target audience. The MFLN and funding partners are seen as true and honest equals in expertise, providing valued and accepted feedback bi-directionally. This often results in surpassing expectations for success in delivery.” —Kyle Kostelecky, National Program Director, MFLN
26–31. DE for MFLN: What It Looks Like (build)
Continuous adaptation to:
• Social, political, economic, technological, and demographic patterns
• Emergent developments/needs of our target audiences
• Cooperative agreement environment
• eXtension
• DoD
32. DE for MFLN: What It Looks Like (not this . . .)
33. DE for MFLN: What It Looks Like (but this . . .)
35–37. DE for MFLN: What It Looks Like (build)
• Inductive
• Chaotic and messy
• Embrace forks in the road
• Developmental moments:
– Track what’s going on
– Understand what’s going on
– Adapt to what’s going on
38. DE for MFLN: What It Looks Like
Everyone is an evaluator
39. Complex development situations are ones in which this… [figure: problem understanding over time; inspired by Jeff Conklin, cognexus.org; Michael Quinn Patton, AEA eStudy webinar 2014]
44–47. DE Successes for MFLN (build)
• Very responsive to our partners’ programmatic requests and innovations
• Doubling in size in 2015 with competitive award
– New CAs and new opportunities for innovation in programming
• Programming
• “Walk the talk”: model use of social media, collaborative learning, personal learning networks
49–56. DE Challenges for MFLN (build)
• Innovation is challenging, slow, and . . . contentious
• Moving from program improvement mode to redesign based on evaluation findings
• IRB challenges
• “In your face” evaluation
• Daily, evaluative thinking
• Platform delivery limitations
• Military culture
• Doubling in size in January 2015
58. DE Features for MFLN
• Reports
• Leadership
• Reflective discussion groups
• Transparency
• Collaboration
60. Patton, Michael Quinn. (2014). “Intermediate Developmental Evaluation.” AEA eStudy Webinar.
Patton, Michael Quinn. (2011). Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford Press.
61. Brigitte Scott
@4ed_eval
brigit2@vt.edu
www.blogs.extension.org/militaryfamilies
Icons made by Freepik: http://www.flaticon.com. Flaticon is licensed under Creative Commons 3.0.
Editor's Notes
DE is not ongoing formative eval
DE supports and documents development
Complex environment = lack of central control; what to do to solve problems is uncertain and there can be conflict about how to proceed
Focus on development, which is not the same as continuous improvement (formative evaluation) or accountability.
Improvement vs. development:
Quality improvement is helping programs meet standards that have been set.
Developmental evaluation is when people are changing what they are doing and the very nature of the standards is also changing; standards are adapting to changing conditions.
DE: the standards themselves, what it means to meet them, how to talk about them, and how to apply them are being developed and changed in relationship to a changing world. This is very different from having a predetermined set of quality criteria that a program has to meet.
DE becomes part of intervention
Developmental evaluation in the context of MFLN is an evaluation paradigm that facilitates program development and innovation through continuous adaptation to the dynamic and changing MFLN cooperative agreement environment.
Our programming isn’t necessarily intended to become a fixed, standardized model, but it does help us identify effective principles that inform ongoing development.
Cooperative agreement:
NIFA, DoD, and UIUC
Different from a traditional grant, in which you articulate your research goals and deliverables before receiving your funding; instead, we receive funding based on a loosely defined set of criteria (plans of work).
An award similar to a grant, but in which the sponsor’s staff may be actively involved in proposal preparation and anticipate having substantial involvement in research activities once the award has been made.
Q: Can you tell me if the PF VLE was a result of that responsiveness? I'm looking for examples of how our plans of work can change during the year based on DoD conversations and requests. I can't remember, but I thought the VLE was an example of that: not planned for 2014 but instead requested after the year started.
A: Yes, this is a great example. DoD thought it was coming at some point; we had talked about it in the past. They felt that it was time, as it met a quickly evolving need from their end. They asked and stressed how important it would be to accomplish this at this time, and we went back to our PF team. They were open and excited, and offered a plan for how they could do it within their budget (not adding but rearranging and deleting other deliverables). After PF’s new plan was discussed with DoD, all were happy to move forward.
Another example of how this cooperative agreement has worked from our end (we instigated a suggested change) was when I began to hear more and more that military caregiving was an increasing concern from those Anne and I talked with at conferences (especially military conferences). We were not hearing this from MC&FP, so I did some homework on my end, began to have these conversations with Betsy and Eddy, and suggested that we should consider expanding in this area. They took this idea back to their offices and partners and then came back to us with an approval to move forward. We did this within our 2010 award at the end, when we needed to re-capture funds and construct a re-budget. We had about $40K to play with and conducted a 5-month pilot. MC&FP liked it, and it was fully funded for 2012 (and of course has been ever since, with great success). So it goes both ways: they ask and we deliver; we suggest and they agree. It’s been great. There are other examples if you need more; just let me know.
eXtension: an emergent and dynamic organization, both tied to and simultaneously oriented away from Cooperative Extension.
Monthly reports
Quarterly reports
Annual reports
Weekly leadership team meetings
Webinar evaluation reports
Quarterly webinar reports
Social Media Specialists meetings
Net Lit support meetings
Real-time innovation is challenging
--We’re very good at innovating at the request of our partners; we're not as nimble when innovating on our own (still top-down in that way)
--PIs are bought out for only a small portion of their time; innovation and change are slow and may not have everyone’s buy-in
Challenge: We were really in program implementation/improvement mode for the first four years. How do you introduce the possibility that you are no longer in program improvement mode and that the program in fact needs to be redesigned? How do you prepare the people you’re working with to begin adapting and changing in response to formative evaluation findings?
Difficult to make changes quickly to evaluation and research
--DoD is looking for all kinds of information, but IRB is a challenge (an 18-month IRB process for working with contractors makes it hard to be responsive and provide real-time feedback)
Evaluation is constantly “in your face”
--Getting everyone to think evaluatively is challenging
Platform limitations for reaching people on bases
--Limits innovation
Military culture (private, formal)
Doubling in size in 2015--lots of work to get done, and most days are about just getting through the day