This document provides guidance on assessing association programs that are failing, flagging, or floundering. It outlines a 5-step process: 1) Define goals of the assessment; 2) Conduct research on the program's history, impact, and data; 3) Measure the program's impact and outcomes; 4) Analyze the program's efficiency and costs; 5) Implement the assessment with key stakeholders and communicate the results. Throughout, it emphasizes identifying the program's original goals, understanding costs, measuring impact both quantitatively and qualitatively, and involving stakeholders in the process to facilitate changes based on the findings.
Fix it or flee it: Proven approaches for dealing with failing, flagging and floundering association programs
1. Fix it or Flee it?
Proven approaches for dealing with
failing, flagging and floundering association programs
Greg Melia, CAE @gmeliacae
2. What We’ll Do Today
• Identify 5 steps essential to program assessment
• Use real-world examples to highlight pitfalls to avoid and key points to remember
• Talk about real-world issues and challenges
6. Defining the Goal of the Review
– Work with key stakeholders and decision-makers
– Solicit their key questions:
• What do they want to know?
• What information will help their decision-making?
• What do they think needs to be evaluated?
– Refine questions to be SMART
7. Original Program Aspects
Investigate the Key “Whys”:
• Why was the program originally established?
• Why has it continued to be offered?
• Why has it changed over time?
Review the Original Intended Approach:
• Who, what, when, where, why and how?
• What was implemented?
• What was not?
8. Lessons Learned
• Verbalizing the goals of the review is an important part of preparing for change
• Seek superordinate goals
• Some agreements are more easily reached upfront
9. Research
3 kinds of data (per Bronislaw Malinowski):
• Corpus inscriptionum
• Opinions and descriptions
• Imponderabilia of behavior
Credit: Photographie et Ethnologie: Les photographes français au XXème siècle devant d’autres formes de cultures.
10. Lessons Learned
• Historical minutes and documents can be VERY informative
• Seek to understand program mutations
• Good decisions based on bad data usually give bad results
• Remember that today’s stakeholders may have been involved in the original program
11. Impact Measure
• Outputs = Immediate (“I attended”)
• Outcomes = Short-term (“I gained knowledge”)
• Impacts = Long-term (“I served more members as a result”)
12. Lessons Learned
• Associations tend to over-focus on outputs and outcomes.
• It is hard to argue against “happy & full,” but sometimes necessary.
• Is the program’s impact large and broad enough to displace alternatives?
14. Efficiency
• Staff and Volunteer Time
• Complaints and Comments
• Marketing/Communication
• Registration/Orders
• Technology
Photo credit: mansikka on Flickr
15. Lessons Learned
• How do people feel about it? Do they have suggestions to increase efficiency?
• What are the trends in terms of level of effort?
• What is the cost/benefit of doing it the same way versus trying a new approach?
• Can improved efficiency save it?
16. Finances
• Direct fixed expenses
– Incurred regardless of how many people participate, buy, or are served
• Variable expenses
– Incurred for each additional unit or person served
• Mixed expenses
– Incurred for each additional set of units or persons served (step costs that rise in blocks rather than per unit)
17. Types of Costs
• Direct fixed expenses
– Staff, Rent
– Purchased equipment
– Per event contracts
• Variable expenses
– Per person contracts
– Attendee gifts, handouts, etc.
• Mixed expenses
– Temporary help
– Office supplies
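The two slides above classify program expenses; combining them gives a simple cost model. Below is a minimal Python sketch of that model. The function, its parameters, the dollar figures, and the block-size treatment of mixed expenses are illustrative assumptions added here, not material from the presentation.

```python
import math

def total_cost(people_served, fixed, variable_per_person, mixed_per_block, block_size):
    """Illustrative cost model built from the three expense types above.

    fixed               -- direct fixed expenses (staff, rent, per-event contracts)
    variable_per_person -- expense incurred for each additional person served
    mixed_per_block     -- step expense incurred per block of people (temp help, supplies)
    block_size          -- assumed number of people covered by one block of mixed expense
    """
    blocks = math.ceil(people_served / block_size) if people_served else 0
    return fixed + variable_per_person * people_served + mixed_per_block * blocks

# Hypothetical figures, purely for illustration:
print(total_cost(250, fixed=8_000, variable_per_person=15, mixed_per_block=500, block_size=100))
# 8,000 fixed + 15 * 250 variable + 500 * 3 mixed blocks = 13,250
```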
18. Key Calculations
• Revenue – Expense = Margin
– Margin per attendee/unit
– Margin per staff hour
– Margin per impact
19. Allocating Overhead Costs
• Equal allotment
• Proportional allotment
– To budget
– To number served
– To staff hours
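A small Python sketch of the allocation methods listed above. The program names, overhead figure, and data values are hypothetical, used only to show how equal and proportional allotment differ.

```python
def allocate_overhead(total_overhead, programs, method="equal"):
    """Split shared overhead across programs by equal or proportional allotment.

    programs -- mapping of program name to {"budget": ..., "served": ..., "staff_hours": ...}
    method   -- "equal", or a proportional basis: "budget", "served", "staff_hours"
    """
    if method == "equal":
        share = total_overhead / len(programs)
        return {name: share for name in programs}
    basis_total = sum(p[method] for p in programs.values())
    return {name: total_overhead * p[method] / basis_total for name, p in programs.items()}

# Hypothetical example: $50,000 of overhead split in proportion to staff hours
programs = {
    "Annual Meeting": {"budget": 80_000, "served": 400, "staff_hours": 900},
    "Webinar Series": {"budget": 20_000, "served": 600, "staff_hours": 300},
}
print(allocate_overhead(50_000, programs, method="staff_hours"))
# {'Annual Meeting': 37500.0, 'Webinar Series': 12500.0}
```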
21. Example
• Revenue – Expenses = Margin
$12,000 - $2,000 = $10,000
• Margin per unit (solicitation):
$10,000/1,000 = $10
• Margin per staff hour:
$10,000/200 = $50
• Margin per impact (contributor):
$10,000/100 = $100
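The example above can be reproduced in a few lines. A minimal Python sketch using the slide’s own figures; the variable names are labels added here for readability.

```python
# Figures taken directly from the example above
revenue, expenses = 12_000, 2_000
solicitations, staff_hours, contributors = 1_000, 200, 100

margin = revenue - expenses                    # $10,000
margin_per_unit = margin / solicitations       # $10 per solicitation
margin_per_staff_hour = margin / staff_hours   # $50 per staff hour
margin_per_impact = margin / contributors      # $100 per contributor
```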
22. Lessons Learned
• One full flight is more profitable than two half-full ones.
• Most do not realize the true full cost.
• Understanding costs helps determine price.
• Price increases (and closing programs) take time.
23. Tips on Implementation
• Internal Evaluators:
– Program knowledge & skills
– May create staff buy-in
– Less $$$ outlay
• External Evaluators:
– Usually seen as more objective
– Assessment expertise
– Comparative ideas
– More $$$ outlay
Photo credit: Matthew Burpee on Flickr
24. Implementation
• Who to involve
– Stakeholders
– Staff
– Non-users
• Who to interview
– Former volunteers
– Former staff
– Vendors
Photo credit: kierenmccarthy.co.uk
25. Communication of Results
• Restate the purpose of the review
• Review what was done
• Arrange key findings in a logical order, highlighting interpretations where appropriate
• Close with recommendations or issues to be addressed
Credit: EE.UTD.Events on Flickr
26. Lessons Learned
• Equivocal recommendations get equivocal results
• Decision-makers need synthesized data, not the opportunity for data analysis
• Survey data is important, but the proof is in historical performance
• Plan and communicate transitions
Chances are your association has at least one program facing dwindling involvement, underperforming finances, or consuming a disproportionate share of staff resources. These programs often plod along year after year, maintained by the momentum of “but we have always done it that way.” Greg will share what he has learned from his involvement in the review of, and intervention in, nearly a dozen such situations over the past ten years. Come learn the key steps in evaluating such programs, and leave with an understanding of the actions you can take (and the pitfalls to avoid) to revamp, revitalize, or even retire them. Sponsored by UGA Hotel and Conference Center.
“If you don’t know where you’re going, you’ll end up somewhere else.” – Yogi Berra
SMART: S = Specific, M = Measurable, A = Attainable, R = Relevant, T = Timely