Yonix presents: It’s all about stakeholder communication (yonix)
It’s all about stakeholder communication: The role of business analysis in determining the success of projects.
As business analysts, it's our job to encourage collaboration and communication between technical and non-technical stakeholders. We represent the requirements of business stakeholders and translate them for our technical team members, and vice versa. We are, in essence, conduits of information.
A large majority of project failures are attributed to the requirements and analysis phases. Are we really doing what is required to meet business and stakeholder expectations?
During this session, Jody Bullen, CEO of Yonix will present some findings from a recent New Zealand survey, highlighting common business analysis problems faced in software development.
Checklist Communication Strategy Development (Ewen Le Borgne)
This presentation was given during a workshop on strategic communication for the Water and Sanitation Forum in Ethiopia (hosted under CRDA) and is based on a checklist of strategic questions developed by IRC Water and Sanitation Centre (www.irc.nl) to help develop a communication strategy. The workshop was facilitated by me and Livia Iotti for the RiPPLE project and by Simret Yasabu for WaterAid Ethiopia.
Communication Strategy - Workshop to Obtain Stakeholder Input (John Mauremootoo)
Generic version of a PowerPoint presentation used in a workshop to obtain stakeholder inputs into a project communication strategy. This presentation can be used as a template when formulating a project or programme communication strategy and work plan.
Change Management And Communications for Complex IT Projects (JanaLee)
Explanation of what Small Planet Works includes as part of "Org Change and Communications" for a large, complex IT project. Also an explanation of why these "people issues" are a critical success factor for IT projects.
Dr Bardini and Cassandra Jessee from YouthPower hosted a workshop on Measuring Positive Youth Development (PYD) at the 8th AfrEA International Conference in Kampala, Uganda.
After this lecture participants will:
understand the basics of project management and know the role of project manager,
understand principles of Project Cycle Management (PCM),
know how to use Logical Framework Approach (LFA) and key terms and definitions for proposals and reports,
understand key elements of project monitoring and evaluation (M&E) and its cycle within the project or program, and
learn about and use Active Implementation Frameworks (AIF).
Show Me the Outcomes!
Evaluating and Proving Your Impact on the Community
Learn how to:
1. Understand how to build a successful outcomes plan for your nonprofit organization
2. Increase your funding by proving your program success to your funders
3. Make informed decisions about future programming and resource allocation
You will also receive an inside view of the Apricot Outcomes Palette™, a dynamic outcomes reporting tool
Presented by:
Kathryn Engelhardt-Cronk
Founder/CEO/President
Community TechKnowledge, Inc.
Global Poverty Action Fund (GPAF) Funding Seminar (NIDOS)
How to make a good application to DFID's Global Poverty Action Fund (GPAF). Training from the Network of International Development Organisations in Scotland (NIDOS). www.nidos.org.uk
This presentation describes Patton's developmental evaluation paradigm, and how the Military Families Learning Network utilizes it for network-wide program evaluation.
Presentation from NCVO's Annual Conference 2011 on The Value of Infrastructure, a three-year England-wide initiative to support infrastructure organisations to plan, assess, improve and communicate their impact.
Making Government User-Centered: Managing UCD projects to promote change (Emma Rose, PhD)
Following the successful redesign of a large informational web site, user-centered design (UCD) became the impetus to create a more user-centered organization. We will share strategies for introducing and managing successful UCD projects, identifying and mitigating project risks, and integrating user-centered design into government processes.
On Thursday 16th October 2014, John Chapman and Andrew Gray presented at the APM Project Management in Practice Event, where the subject area was an Introduction to Programme Management.
Theirs was an interactive session: John provided the theoretical side of programme management, whilst Andrew explained how this worked using a real-life example from the UK MOD, where a Programme Management approach was adopted using the Managing Successful Programmes (MSP) framework.
The Programme Lifecycle gave structure to the presentation, covering seven areas:
1. What is a programme?
2. Why do a programme?
3. What makes up a programme?
4. How do we run a programme?
5. Who is in the programme?
6. When does a programme end?
7. What challenges are faced?
It was important to show how Programme Management called upon the specialisms from the other Specific Interest Groups.
An example of this relates to Benefits Management. Early on in the programme the questions to be asked, and answered, include:
1. Is there a vision of a changed future?
2. Is this a shared single vision?
3. Is it in line with what is needed?
4. What are the benefits to be gained?
5. Who benefits, what do they benefit, how much benefit, when do they benefit?
Andrew commented that an important area to consider was stakeholder management. With a high-profile programme, there are many diverse stakeholder groups and interfaces, including:
• An external advisory group
• Local representatives and committees
• Regulators & policy holders
• UK & Scottish governments
• Press coverage
• Wide ranging public consultations
Consultation and communication (two way) would then provide inputs and influences to the decision making process within the Programme.
At the end of the presentation Andrew noted the lessons learned (so far) on the adoption of a programme management approach as:
A Programme Management approach is not for everything
- Split change element of the objectives from long-term business as usual
Bring clarity & focus
- Projects need to know how they fit into ‘big change picture’
Get senior commitment
- Have the approach endorsed by the Programme Board
Co-ordinate stakeholder engagement
- Communications must be co-ordinated and consistent across the projects
Scale the management investment that is needed
- Do not swamp with bureaucracy
Efficient pooling of resources
- A small programme team benefits from pooling common central activities
Cope with geographically dispersed team
- Programme Management approach is the glue to hold things together
Leading Transformation Programs in Large / Global Organizations (Kaali Dass PMP, PhD)
Research shows that, on average, about 70% of transformation programs fail.
This presentation focuses on the need for transformation in organizations and proposes a model for implementing transformation programs successfully in large / global organizations.
Organizational Change Management, presented by Hany Sewilam AbdelHamid: Leading Change and Making it Stick, where you can improve your internal and external environment and change the process of MD.
In this interactive discussion, CEO of DevelopIntelligence, Kelby Zorgdrager, will share strategies for measuring the impact of training from a variety of large organizations across the country: Salesforce, Eventbrite, VMware and Autodesk. In this session, you will discover and engage with current training program methodologies in order to determine your company’s return on investment on learning.
Safalta Digital Marketing Institute in Noida provides complete programs that cover a wide range of digital marketing components, including search engine optimization (SEO), digital communication marketing, pay-per-click (PPC) advertising, content marketing, web analytics, and more. These courses are designed to give students a comprehensive understanding of digital marketing strategies. The institute is a first choice for young people and students looking to start a career in digital marketing, offering specialized courses and certification for beginners, with thorough training in areas such as SEO, digital communication marketing, and PPC in Noida. After finishing the program, students receive certifications recognised by top universities, setting a strong foundation for a successful career in digital marketing.
The latest issue of The Challenger is here! We are thrilled to announce that our school paper has qualified for the National Schools Press Conference (NSPC) 2024. Thank you for your unwavering support and trust. Dive into the stories that made us stand out!
Model Attribute Check Company Auto Property (Celine George)
In Odoo, the multi-company feature allows you to manage multiple companies within a single Odoo database instance. Each company can have its own configurations while still sharing common resources such as products, customers, and suppliers.
Normal Labour / Stages of Labour / Mechanism of Labour (Wasim Ak)
Normal labor, also termed spontaneous labor, is defined as the natural physiological process through which the fetus, placenta, and membranes are expelled from the uterus through the birth canal at term (37 to 42 weeks of gestation).
Introduction to AI for Nonprofits with Tapp Network (TechSoup)
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
Biological screening of herbal drugs: introduction and the need for phyto-pharmacological screening; new strategies for evaluating natural products; in vitro evaluation techniques for antioxidant, antimicrobial, and anticancer drugs; in vivo evaluation techniques for anti-inflammatory, antiulcer, anticancer, wound healing, antidiabetic, hepatoprotective, cardioprotective, diuretic, and antifertility activity; toxicity studies as per OECD guidelines.
Unit 8 - Information and Communication Technology (Paper I).pdf (Thiyagu K)
These slides describe the basic concepts of ICT, the basics of email, emerging technologies, and digital initiatives in education. The presentation aligns with the UGC Paper I syllabus.
These slides are intended for master's students (MIBS & MIFB) at UUM, and are also useful for readers interested in contemporary Islamic banking.
2. Developmental Evaluation
“DE supports innovation development to guide adaptation to emergent and dynamic realities in complex environments.” (MQP)
Goal: support project, program, and organizational development with timely feedback
3. Core DE Question
What is getting developed and what are the implications of what gets developed?
Complexity theory of change: bring people together who are knowledgeable and committed and they will self-organize, take action, and work together to create movement, innovation, and change.
6. Focus on development
Development is when people are changing what they are doing, and the very nature of the standards is also changing. Standards are adapting to changing conditions.
NOT the same as continuous improvement (formative eval or accountability):
– Quality improvement is helping programs meet standards that have been set.
8. Rapid and continual feedback
Ask evaluative questions
Apply evaluative logic
Gather and report evaluative data
Evaluation becomes part of
9. Developmental Evaluator
Works collaboratively with innovators to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development.
10. Evaluation for MFLN
• Ongoing development
• Innovation in learning
– Internal practice
– External product
• Reflection and action
12. MFLN: Focus on development
Programming has been established…
– Identify guiding principles that inform ongoing development
– Establishes track record, reliability, and reputation
... BUT is not necessarily fixed
– Innovate, innovate, innovate
– Internal adaptability to meet changing external standards and conditions
14. Cooperative Agreement
“An opportunity to provide responsive and timely educational programming through being actively nimble, flexible, innovative, and creative in true partnership with our funders as they identify organizational priorities and ask us to engage in the construction of appropriate and necessary deliverables that meet the on-time needs of the target audience. The MFLN and funding partners are seen as true and honest equals in expertise, providing valued and accepted feedback bi-directionally. This often results in
16. Internal evaluator
Member of leadership team
Elucidates innovation and adaptation processes
Tracks implications and results
Facilitates data-based decision-making
17-22. DE for MFLN: What It Looks Like
Continuous adaptation:
• Social, political, economic, technological, and demographic patterns
• Emergent developments/needs of our target audiences
• Cooperative agreement environment
• eXtension
• DoD
23. DE for MFLN: What It Looks Like
(not this. . . .)
31-33. DE for MFLN: What It Looks Like
Inductive
Chaotic and messy
Embrace forks in the road
Developmental moments:
Track what’s going on
Understand what’s going on
Adapt to what’s going on
34. DE for MFLN: What It Looks Like
Everyone is an evaluator
36-39. DE Successes for MFLN
Very responsive to our partners’ programmatic requests and innovations
You. Here. Now.
Programming
“Walk the talk”: model use of social media, collaborative learning, personal learning networks
40-46. DE Challenges for MFLN
Innovation is challenging, slow, and can be contentious
Moving from program improvement mode to redesign based on evaluation findings
IRB challenges
Daily, evaluative thinking
Platform delivery limitations
Military culture
Scaling up
47. DE Features for MFLN: Nodal Evaluations
Webinars
Social media
Focus groups
Deliverables/plans of work
• Why as much as what
Documentation
• Change, innovation, decision-making
48. DE Features for MFLN: Reporting
From Concentration Areas:
• PIs
– Monthly via Google forms: narrative updates, pubs, presentations, challenges
• Social Media Specialists
– Monthly via Google forms: blogs, AaE, CEUs
– Webinar evaluation reports
49. DE Features for MFLN: Reporting
To CAs:
• Monthly social media from Sprout Social
• Quarterly webinar data reports
Internal/External:
• Monthly (internal/DoD)
• Quarterly (NIFA)
• Quarterly webinar (internal/DoD)
• Annual (internal/DoD/NIFA)
50. DE Features for MFLN: Reflective Discussion
CAs:
• Monthly social media specialists meetings
– NetLit, processes, evaluation, communications
Leadership:
• Weekly meetings
PIs:
• Monthly meetings
51. DE Features for MFLN: Learning and Working Paradigms
Leadership
Reflection
Transparency
Collaboration
Action
Innovation
Editor's Notes
DE is not ongoing formative eval
DE supports and documents development
Focus on development, which is not the same as continuous improvement (formative eval) or accountability
Improvement vs. development
Quality improvement is helping programs meet standards that have been set.
Developmental evaluation is when people are changing what they are doing and the very nature of the standards are also changing. Standards are adapting to changing conditions
DE: standards themselves and what it means to meet them and how to talk about them and how to apply them is being developed and changed in relationship to a changing world; very different from having a predetermined set of quality criteria that a program has to meet.
Cooperative agreement:
NIFA, DoD, and UIUC
Different than traditional grant, which is when you articulate your research goals and deliverables before receiving your funding; instead, we receive funding based on a loosely defined set of criteria (plans of work)
An award similar to a grant, but I which the sponsor’s staff may be actively involved in proposal preparation, and anticipates having substantial involvement in research activities once the award has been made.
Can you tell me if the PF VLE was a result of that responsiveness? I'm looking for examples of how our plans of work can change during the year based on DoD conversations and requests. I can't remember, but I thought the VLE was an example of that--not planned for 2014 but instead requested after the year started.

Yes, this is a great example. DoD thought it was coming at some point - we had talked about it in the past. They felt that it was time, as it met a quickly evolving need from their end. They asked and stressed how important it would be to accomplish this at this time, and we went back to our PF team; they were open, excited, and offered a plan for how they could do it within their budget (not adding but rearranging and deleting other deliverables). After PF’s new plan was discussed with DoD, all were happy to move forward.

Another example of how this cooperative agreement has worked from our end (we instigated a suggested change) was when I began to hear more and more about how military caregiving was an increasing concern from those Anne and I talked with when we were at conferences (especially military conferences). We were not hearing this from MC&FP, so I did some homework on my end, began to have these conversations with Betsy and Eddy, and suggested that we should consider expanding in this area. They took this idea back to their offices and partners and then came back to us with an approval to move forward. We did this within our 2010 award at the end, when we needed to re-capture funds and construct a re-budget. We had about $40K to play with and conducted a 5-month pilot. MC&FP liked it, so it was fully funded for 2012 (and of course has been ever since, with great success). So it goes both ways: they ask and we deliver; we suggest and they agree. It’s been great. There are other examples if you need more, just let me know.
eXtension: emergent and dynamic organization both tied to and simultaneously oriented away from Cooperative Extension.
Real-time innovation is challenging
--We’re very good at innovating at the request of our partners; not as nimble when innovating on our own (still top down in that way)
--PIs bought out for only a small portion of time; innovation and change is slow, may not have everyone’s buy-in
Challenge: We were really in program implementation/improvement mode for the first four years. How do you introduce the possibility that you are no longer in program improvement mode but in fact that program needs to be redesigned? How do you prepare the people you’re working with for the possibility that you are no longer in program improvement mode but rather that you need to begin adapting and changing to findings of formative evaluation?
Difficult to make changes quickly to eval and research
--DoD looking for all kinds of information, but IRB is a challenge (18 month IRB process for working with contractors; hard to be responsive and provide real-time feedback)
Evaluation is constantly “in your face”
--Getting everyone to think evaluatively is challenging
Platform limitations for reaching people on bases;
--Limits innovation
Military culture (private, formal);
Doubling in size in 2015—lots of work to get done and most days are about just getting through the day