Sharing the 6 attributes of a mutually beneficial volunteering program, the 6 drivers of volunteer satisfaction, and results of the ASAE Foundation volunteer research, plus association examples.
3. Research
Overview
• Phase 1: Associations
• Part 1 – Association staff survey | 1,016 unique association respondents
• Diverse group … 39% trade associations, 43% individual member organizations, 14% hybrid, 4% other organizations
• Part 2 – In-depth phone interviews with staff from 75 associations
• Phase 2: Members
• 50 associations | 25,250 member respondents of 121,000 surveyed (21% response)
• Diverse group … physicians, nurses, pharmacists, mental health professionals, educators, real estate professionals, architects, trade associations, etc.
• Phase 3: In-depth audit of key performance metrics
4. Mutually Beneficial Volunteer System:
a system in which the volunteer makes a
meaningful contribution to the organization’s
mission and the management process makes
a meaningful contribution to the volunteer’s
professional development and personal
satisfaction
7. How would you rate your
volunteer management
system in these 6 areas?
8. 6 Drivers
• Quality of staff who serve as liaisons/coordinators of their activity
• Receptivity of staff to give their input appropriate consideration
• Quality of orientation/introduction
• Quality of their volunteer leadership
• Ability to debate/discuss issues
• Time & timing
9. How does your association
rate in delivering on these
drivers?
10. Mutually Beneficial Volunteer System:
Scoring Tool
1. Survey staff and volunteers.
• Parallel construction
• 5-point scales
2. Calculate individual, section, and overall averages.
3. Subtract the volunteer average from the staff average.
4. Analyze gaps at the individual-attribute, section, and overall levels.
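The four steps above can be sketched in code. This is a hypothetical illustration of the scoring tool's arithmetic only: the attribute names and the 1–5 ratings below are made up for the example, not drawn from the research data.

```python
# Sketch of the Mutually Beneficial Volunteer System scoring tool:
# parallel staff and volunteer surveys on 5-point scales, averaged,
# with gap = staff average - volunteer average. A positive gap means
# staff rate the system higher than volunteers do.
from statistics import mean

# Each dict maps an attribute to one group's 1-5 ratings (illustrative values).
staff_ratings = {
    "staff liaison quality": [4, 5, 4],
    "receptivity to input": [4, 4, 5],
    "orientation quality": [3, 4, 4],
}
volunteer_ratings = {
    "staff liaison quality": [4, 3, 4],
    "receptivity to input": [3, 3, 4],
    "orientation quality": [3, 3, 2],
}

def gap_report(staff, volunteers):
    """Per-attribute (staff avg, volunteer avg, staff-minus-volunteer gap)."""
    report = {}
    for attr in staff:
        s_avg = mean(staff[attr])
        v_avg = mean(volunteers[attr])
        report[attr] = (round(s_avg, 2), round(v_avg, 2), round(s_avg - v_avg, 2))
    return report

# Overall averages across all attributes, then the overall gap.
overall_staff = mean(mean(v) for v in staff_ratings.values())
overall_vol = mean(mean(v) for v in volunteer_ratings.values())

for attr, (s, v, gap) in gap_report(staff_ratings, volunteer_ratings).items():
    print(f"{attr}: staff {s}, volunteers {v}, gap {gap:+.2f}")
print(f"overall gap: {overall_staff - overall_vol:+.2f}")
```

In practice the same loop runs at the section level too; large positive gaps flag attributes where staff think the system works better than volunteers experience it.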
11. Job Design | Meaningful work
Chief among these is job design that doesn't offer flexible scheduling or make effective use of experience
12. Job Design | Meaningful work
• [Diagram] Elements of meaningful job design: fit to mission, fit to members, goals, resources, tasks to be performed, time required, KSAs required – flexible, accessible, appropriate
13. Job Design | Example
• Volunteer Committee – Lead!
- Engaging activity
- Developing short-term opportunities
- Working with committees on developing positions
• Process: Board sets committee charter → committee identifies skills needed and commitments → Volunteer Committee recruits & refers
16. Recruitment/Selection | Right Person-Right Job
• Staff:
• Nearly 50% have to accept volunteers who are not as committed or qualified
• More than 30% have some/many volunteers not well-suited for the roles they serve
• Volunteers/Non-Volunteers:
• Rate the selection process low
Low grades from staff and volunteers
20. Assessment | Constructive Feedback
• Members: seeking more feedback – the area of lowest satisfaction
• Top-ranked items for improvement:
• Better sense of how their work fits into the big picture
• Clearer expectations regarding a job well done
• More feedback!
Only 10% formally evaluate volunteers
28% say lack of assessment leads to unqualified volunteers
21. Assessment | Constructive Feedback
• Who assesses? How often?
• What are the criteria? How are they scored?
• How are the evaluations recorded and stored?
• How often are the evaluations reported? To whom?
• How is underperformance addressed?
• Who reviews the assessments/assessors?
22. Assessment | Example
• Chair evaluates committee members
• VPs evaluate committee members & chair
• Scores are averaged
• Feedback:
• Self-assessment
• Some done on a 1:1 basis (by request)
• Underperformers just don't get invited back
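The averaging step in this example is simple enough to show directly. A minimal sketch, assuming the 1–5 anchored criteria mentioned in the speaker notes ("shows up", "respectful/cooperative", "does the job"); the individual scores are invented for illustration.

```python
# Sketch of the slide-22 assessment example: the chair and a VP each
# score a committee member on 1-5 anchored criteria, and the two
# assessors' scores are averaged to produce the feedback record.
from statistics import mean

criteria = ["shows up", "respectful/cooperative", "does the job"]

# Illustrative scores from the two assessors (1-5 scale with anchors).
chair_scores = {"shows up": 5, "respectful/cooperative": 4, "does the job": 4}
vp_scores = {"shows up": 4, "respectful/cooperative": 4, "does the job": 3}

# Average the two assessors per criterion.
averaged = {c: mean([chair_scores[c], vp_scores[c]]) for c in criteria}
print(averaged)
```

Per the notes, the averaged scores are kept confidential; the member sees the form and reflects on their own performance rather than receiving the numbers.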
24. How do you assess and
evaluate volunteers and/or
volunteer groups?
25. Fully Developing Your Volunteer Workforce …
What we looked at …
• Defined Mutually Beneficial Volunteer System
• Explored 6 attributes & drivers
• Shared a scoring tool
• Addressed 3 pain points (job design,
recruitment/selection, assessment)
26. Contact Us
Peggy Hoffman, CAE
President | Mariner Management
phoffman@marinermanagement.com
301.725.2508 | @peggyhoffman
Kevin Whorton
President | Whorton Marketing &
Research
industrysurveys@kwhorton.com
202.258.9889
Editor's Notes
Our research strongly suggests volunteers create value for the association in a number of areas, some of which can be objectively quantified and potentially monetized.
Table conversation
Table conversation
Tool is an extension and application of the research
When comparing perceptions of volunteers & non-volunteers, the largest gaps focus on areas related to job design – that is, time commitment, timing, and how volunteering complements their job
We asked non-volunteers (almost a third of the respondents) what kept them from volunteering and compared their responses to volunteers’ satisfaction with the same areas to identify gaps that might be filled with improvements to the volunteer management system. The top areas are primarily those related to job design.
Chief among these is job design that doesn’t address time constraints and convenience
Assessment: Elephant in the room!
Are the volunteers doing a good job? How do you know? What are the metrics?
Is it OK to assess? If so, who does the assessing?
What if they aren’t doing a good job? Can they be fired?
10% formally evaluate volunteers and 28% indicated that the lack of an effective assessment process resulted in volunteers in roles for which they were unqualified.
Assessment history for some going back to 1970!
Tool for ratings has changed over time.
Member is evaluated by the member they report to.
All are rated – registration, welcome, room monitors, etc.
Current practice: 1–5 scale with anchors
Criteria: Shows up? Respectful/cooperative? Does the job?
Done with index cards
Results are not shared with the volunteer – kept confidential. We suggest the member reflect back on their performance; we share the form but not the results.
http://www.ahima.org/volunteers?tabid=assessment
National Volunteer
HL: We’re still new (1 yr)
Does have TFs – serve a 1-yr term, on-demand
400 volunteers + elected CSA leaders (P/PE/T/SA/SL/Awareness) brings it to >1,000. Just starting with tracking.
70k members – but has seen growth. 75 new volunteer members, 4 new committees plus ad hoc groups. We hadn’t been tracking ad hoc roles and weren’t recognizing them.
Trying to get fresh blood
Volunteer Recognition – really haven’t done much due to AMS
NVW – offer a discount during that week on P&S
What’s their big pain
New volunteers, getting people to step up
WV president for 3rd; not a small state or large
Developing volunteers to progress ad-hoc, appointed, elected
How to bring them in, hook them to keep them, and deepen engagement
Competencies tool
Competencies – how built? http://www.ahima.org/volunteers?tabid=assessment
Encourage CSA officers
Used extensively at the national level; NomCom draws from it in interviewing
Developed by a volunteer TF – at first it didn’t have anything HIM-specific, but that was added.
Individuals have used it to say “I need more”; used with delegates to help them identify competencies – shifted them to be focused on environmental scanning, for example
Just an HTML