Packard Foundation OE Peer Learning Group

Speaker Notes
  • Welcome. This is the very first session for this project and I’m thrilled that you have decided to participate in this learning journey. I look forward to learning a lot from you. Today’s call is an orientation to the program and an opportunity for you to ask questions. My gratitude to The David and Lucile Packard Foundation for supporting this project and my work …
  • Every few minutes as we get started: tech support reminder, type into the chat, roll call.
  • http://www.flickr.com/photos/malinki/2621920871/sizes/o/ Start recording about 2 minutes late to let people join. *2
  • Welcome. This is the very first session for this project and I’m thrilled that you have decided to participate in this learning journey. I look forward to learning a lot from you. Today’s call is an orientation to the program and an opportunity for you to ask questions. My gratitude to The David and Lucile Packard Foundation OE program for supporting this project and my work … Every project I do with the Packard Foundation is filled with learning. Last year’s group produced the book “Measuring the Networked Nonprofit” – and some of you were a part of those sessions. This year, I am refining the curriculum, but more to the point, building a measurement system for improving this program. I am going to be very transparent about measurement – it isn’t about grades or anything scary – it is about making this peer learning program better, improving my skills designing and delivering – and more importantly being able to understand any transformational changes in your practice in a more scientific way. These sessions will help develop a more scientific approach to the CWRF framework – and I will explain that later on.
  • This is our agenda – we’ll pause along the way for questions.
  • Here’s a little bit about me – blogger, author, trainer. A lot of my work lately has been designing and facilitating peer learning networks about becoming networked nonprofits and social media – the photo there is a cluster of Packard Fdn. grantees that focus on family planning … I was in Delhi in June for the start up – an intensive boot camp, followed by remote assistance. There were great lunches there, so to avoid people falling asleep … I made them move. The hotel had a beautiful three-story staircase and they had to do laps … so if you do training, incorporating movement and interaction helps people learn, and we’re going to do a lot of that today!
  • Official Welcome (10 minutes). Program Overview: Orient participants to the four days and overall program, including expectations (10 minutes). Exercise (25 minutes): Facilitator asks the group to form 4 groups of 4 people each – social media implementers in two groups, and senior leaders in two groups. Each group will meet over the next 10 minutes. Their task is to reflect together on the following question: What are your hopes for this program and any fears or concerns you have for it as well? After the discussion period, each group will have 3 minutes to share their group's hopes and fears. At the conclusion, facilitator asks the group for their comments, observations and reflections on the whole to debrief.
  • Because this is our first call, I’m going to run down the names of the organizations and have everyone say their name – so we can hear each other’s voices. I won’t be doing this for every call – we will get into a groove where you should arrive 5-10 minutes early and announce yourself. If you arrive after the call is underway, let us know you are here by using the chat.
  • https://www.surveymonkey.com/s/Measure-Netnon-Grp-1-LOA
  • I’ve learned that it is important that everyone understand that these sessions aren’t just about content delivery. We are using a “Ready, Set, Go” model … with an emphasis on the “GO.” That means you all will be doing action learning projects and sharing what you put into practice. I feel strongly that this method is what builds capacity – not for all – but for many. You will be part of the content – I will deliver some content on the calls – and Stephanie and I will be sharing content/links/resources in the Facebook Group – but all of you have the capacity to develop your measurement skills.
  • Each session will include the following related to each best practice: framework, examples, and additional how-to resources. The wiki will have links and resources as well as links to notes from the call, and will serve as the hub for journals and over-the-shoulder learning. The wiki will be updated with resources suggested or used by participants during the calls or office hours.
  • My first step was to follow the instructions in Chapter 5 about coming up with a theory of change …. This is a process of repeatedly asking “so what? who? what?” …. I originally said, “Grantees will get more likes on Facebook,” but pushed myself to keep asking – so what, so what … Anyway, the ultimate goal is .. but you see this is a six-month program … read the slide.
  • I was lucky enough to discuss this with the Packard Foundation evaluation team … the advice I got was that this was reasonable measurement … The first outcome is more of an activity – I’ve found that if participants do the work, they improve their skills. Based on my experience, 50% is a reasonable completion rate. I am also hoping to capture 6 case studies – that illustrate the work we’ve done – that can be shared on my blog and presented during our learning culmination call or along the way. This is where the office hours come in … I’m using a process called “baseline” – I establish where you all are at the beginning, then I do the same survey at the end and compare (see the hypothetical baseline-comparison sketch after these notes). This is not necessarily a beginner approach …. The baseline is based on the CWRF, which I’ve been working on over several years, testing and reiterating – many thanks to the Packard Foundation’s support to do this .. this year I am trying to quantify the transformation in skills or knowledge. The measures need further validating, so we’re still testing. One thing I should note is that the measures are not a report card for you .. if you got 1.2 on something and ended up with 1.5 – that’s progress … it doesn’t matter that someone else got a 2.1 .. so the numbers are not value judgements. What this does is give me a benchmark for the group, helps me customize the content, etc.
  • The action learning projects are critical to the success of the program .. so I will be measuring them as well.
  • The maturity of practice framework includes looking at 7 best practice areas for networked approaches and social media – and some specific indicators – and looking at what they look like at the different maturity levels. If you remember the application form, it asked you questions and that’s how I came up with the scoring system. If you were “Crawl” you got 1, Walk 2, Run 3, and Fly 4 – and then I average the scores for the group. I can also come up with a score for your organization overall (see the hypothetical scoring sketch after these notes). So, if you got a 1.5, it means that you are on your way to walking. https://docs.google.com/spreadsheet/ccc?key=0AtsV5h84LWk0dFhENWFXVzBwZ2lWOGlzazZSek5Iemc#gid=1
  • These are the measurement indicators that were on the baseline survey and have been used to guide the content of the peer learning groups. But since the focus is on improving your measurement skills, I’m only going to measure and report on the three measurement indicators …….
  • For your action learning projects, it will be important for crawlers to set up a regular system and discipline – don’t take on too big a project .. make it a small win. For those walking, it is important to get everyone’s input …. I know this can be tricky – but we are here to support you.
  • This might require a leap in terms of your budget – if necessary …… one of our sessions will be on professional tools – and I’m going to survey you all and find out what you’re using and then have you share your knowledge about it ….
  • This is really important – how you make sense of and apply the data …….
  • The maturity of practice framework includes looking at 7 best practice areas for networked approaches and social media – and some specific indicators – and looking at what they look like at the different maturity levels. If you remember the application form, it asked you questions and that’s how I came up with the scoring system. If you were “Crawl” you got 1, Walk 2, Run 3, and Fly 4 – and then I average the scores for the group. I can also come up with a score for your organization overall. So, if you got a 1.5, it means that you are on your way to walking. https://docs.google.com/spreadsheet/ccc?key=0AtsV5h84LWk0dFhENWFXVzBwZ2lWOGlzazZSek5Iemc#gid=1
  • http://www.flickr.com/photos/cgc/5259321/sizes/m/
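
A minimal, hypothetical sketch of the crawl-walk-run-fly scoring described in the notes above: survey answers are mapped to 1-4, then averaged per indicator across the group and per organization. The indicator names and sample responses here are invented for illustration only; the real scores live in the linked spreadsheet.

    # Hypothetical sketch of the CWRF scoring: Crawl=1, Walk=2, Run=3, Fly=4,
    # averaged per indicator (group) and per organization. Sample data is made up.
    LEVEL_SCORES = {"crawl": 1, "walk": 2, "run": 3, "fly": 4}

    responses = {
        "Org A": {"analysis": "crawl", "tools": "walk", "adjustment": "run"},
        "Org B": {"analysis": "walk", "tools": "run", "adjustment": "walk"},
        "Org C": {"analysis": "crawl", "tools": "walk", "adjustment": "fly"},
    }

    def org_score(levels):
        """Average maturity score for one organization across its indicators."""
        scores = [LEVEL_SCORES[level] for level in levels.values()]
        return sum(scores) / len(scores)

    def indicator_averages(all_responses):
        """Group average per indicator (e.g., 1.5 means 'on the way to walking')."""
        by_indicator = {}
        for levels in all_responses.values():
            for indicator, level in levels.items():
                by_indicator.setdefault(indicator, []).append(LEVEL_SCORES[level])
        return {ind: sum(vals) / len(vals) for ind, vals in by_indicator.items()}

    for org, levels in responses.items():
        print(f"{org}: {org_score(levels):.2f}")
    for indicator, avg in indicator_averages(responses).items():
        print(f"{indicator}: group average {avg:.2f}")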
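
And a similarly hypothetical sketch of the baseline comparison: the same survey is scored at the start and end of the program, per-organization change is computed, and the program outcome that at least 50% of participants improve one or more measurement indicators by 0.5 points is checked. All numbers are invented for illustration, not actual program data.

    # Hypothetical sketch: compare baseline and endline CWRF scores and check the
    # "50% improve at least one measurement indicator by 0.5" outcome. Invented data.
    baseline = {
        "Org A": {"analysis": 1.0, "tools": 2.0, "adjustment": 2.0},
        "Org B": {"analysis": 2.0, "tools": 2.0, "adjustment": 3.0},
    }
    endline = {
        "Org A": {"analysis": 2.0, "tools": 2.0, "adjustment": 2.5},
        "Org B": {"analysis": 2.0, "tools": 2.4, "adjustment": 3.0},
    }

    def improved(before, after, threshold=0.5):
        """True if any indicator improved by at least `threshold` points."""
        return any(after[ind] - before[ind] >= threshold for ind in before)

    improvers = [org for org in baseline if improved(baseline[org], endline[org])]
    share = len(improvers) / len(baseline)
    print(f"Improvers: {improvers} ({share:.0%} of organizations)")
    print("Outcome met" if share >= 0.5 else "Outcome not met")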
Transcript

    • 1. Peer Learning Group 1: Measuring the Networked Nonprofit: Action Learning Project. Orientation Call: Jan. 22, 2013. Beth Kanter, Visiting Scholar, Social Media and Nonprofits, The David and Lucile Packard Foundation Organizational Effectiveness Program
    • 2. Welcome! If you experience any technical difficulties logging into the system, please contact ReadyTalk Customer Support: 800.843.9166. Please use *6 to mute your conference line. While we are waiting, play with the chat: type in your name, organization, and location. What are your greatest hopes for participating in this program? Greatest concerns? (Only the moderator can see your chats.)
    • 3. This call is being recorded. *2 (Flickr photo by Malinki)
    • 4. Peer Learning Group 1: Measuring the Networked Nonprofit: Action Learning Project. Orientation Call: Jan. 22, 2013. Beth Kanter, Visiting Scholar, Social Media and Nonprofits, The David and Lucile Packard Foundation Organizational Effectiveness Program
    • 5. Agenda: Intros • Program Overview • Maturity of Practice • Next Session • Reflection. Tweet your insights: #netnon (Only the moderator can see your chats.)
    • 6. Beth Kanter
    • 7. Stephanie Rudat
    • 8. Participants: Roll Call (*7 to unmute, *6 to mute): AAPIP, American Civil Liberties Union, American Leadership Forum - Silicon Valley, Arts Council Silicon Valley, Community Foundation Santa Cruz County, COMPASS, Exhale, GlobalGiving, Grantmakers for Effective Organizations, Ibis Reproductive Health, International Women's Health Coalition, Kuumbwa Jazz, Leadership Learning Community, Leopold Leadership Program, Marine Science Institute, PACT, Population Action International, Roots of Change, Stanford Social Innovation Review, The Encore Fellowships Network (hosted by encore.org), United Way Silicon Valley, Upwell (incubated by Ocean Conservancy), WildAid, Young Invincibles
    • 9. Notifications Sign Up: More than one person per organization can participate and tag team the work, but whoever joins the call must be prepared. https://www.surveymonkey.com/s/2013-measure-netnon-grp1-notify I will also be sending out calendar invitations for the rest of the calls so you can add them to your calendars. Note, for some folks, it might not work depending on what you use for your calendar system and your IT set up. Auditors: http://measure-netnon.wikispaces.com/Participants
    • 10. Participation Expectations: 2-8 hours per month. Letter of Agreement: https://www.surveymonkey.com/s/Measure-Netnon-Grp-1-LOA
      • More than one person per organization can participate and tag team the work, but whoever joins the call must be prepared
      • Attend conference calls and participate in Facebook Group
      • Participants will self-define their “homework” related to the topic of the call and their action learning projects
      • Each organization will have a “wiki” journal for notes during the program and “look over the shoulder learning” is encouraged (and an important part of my measurement plan)
      • Beth will publish a regular blog post summarizing the best practices or a case study
      • Beth will hold regular office hours for small groups or one-on-one remedial assistance for action learning projects, prep presentations, or case studies
    • 11. Conference Call Schedule and Topics:
      January 22 – Orientation; Maturity of Practice Assessment; Action Learning Project
      February 4 – Basic Measurement Steps; Becoming Data Informed; Action Learning Projects – Questions
      February 25 – Measuring Engagement; Action Learning Projects Identified
      March 18 – Measuring Influence
      April 15 – Measurement Tools and Dashboards *
      May 29 – Making Sense of Your Data **
      June 24 – Culmination
      • All calls at 1 PM PST – except for 1/22 • All calls one hour – except for 6/24 • All calls on Mondays except 5/29 due to holiday
    • 12. Peer Learning Conference Calls: Structure – Check In • Next • Topic • Action • Discussion. Call-In: 866-740-1260, passcode: 740-5939. http://www.readytalk.com, passcode: 740-5939
    • 13. The Wiki http://measure-netnon.wikispaces.com/
    • 14. Office Hours: Optional • Coaching for Action Learning Projects • Coaching for Presenting on Call • 30 minute sessions • https://my.timedriver.com/4QHZG
    • 15. Stealth Measurement: Closed Facebook Group
    • 16. Action Learning: Project • Apply the 7 steps of measurement • Project should focus on smaller, doable activity or campaign to measure • Project should ideally start and finish by June, 2013 • Don’t need to measure everything about it • Go for the easy win • Priority in work plan • Keeps ED up at night • Area of practice to improve
    • 17. Action Learning Project Types: Single Program/Event/Channel; Formal Ladder of Engagement Model; Brand Monitoring; Influencer Research; Evaluation and Selection of Tools; Improve Internal System – Dashboard, Reporting, Sense-Making; Prelude Project: Benchmark or Research. Identifying An Action Learning Project – Deliverable: • Design complete by March 1st or sooner • Implementation – March 1 - June 1st • Presentation – June 24th
    • 18. Peer Learning Program: My Use of Measurement. I’m eating my own dog food so I have empathy for you!
    • 19. Theory of Change outcomes: Grantees’ communications strategies have more impact on policy and social change. Grantees have better relationships with influencers, partners, and stakeholders. Grantees learn from each other’s experiences, saving time and getting better results. Grantees get better at social media integration strategy and measurement and learning discipline. Grantees implement action learning pilots and share learning with each other.
    • 20. Peer Learning Program Outcomes • 50% (12) of participants design, implement, and document an action learning project that improves results of social media strategy through measurement • Participants generate six case studies of how nonprofits can measure social media effectively that are published on Beth’s Blog and/or presented during a call • 50% (12) of participants improve baseline level of maturity for one or more CWRF Measurement Indicators and the average for the group increases by .5 point
    • 21. Ladder of Engagement: Action Learning Projects – Case Study 25% (6); Project Finished 50% (12); Implementing, Not Finished; Designed, Not Implementing; Not Started. (POLL example) Measurement Plan: Webinar Polls
    • 22. Maturity of Practice: Crawl-Walk-Run-Fly (CRAWL = 1, WALK = 2, RUN = 3, FLY = 4). Categories, practices, and group averages:
      CULTURE – Networked Mindset 2.17; Institutional Support 2.21
      CAPACITY – Staffing 2.13; Communications Strategy 2.21
      MEASUREMENT – Analysis 1.67; Tools 2.21; Adjustment 2.38
      LISTENING – Brand Monitoring 1.58; Influencer Research 1.46
      ENGAGEMENT – Ladder of Engagement 1.46
      CONTENT – Integration/Optimization 1.50
      NETWORK – Champions 1.08; Relationship Mapping 1.54
      All Indicators for the Entire Group Average: 1.81. Spreadsheet: http://bit.ly/spreadsheet-group-1-netnon-2013
    • 23. Measurement Indicators: 50% improve at least one by .5 – Scores: 1.67, 2.21, 2.38
    • 24. Measurement Indicators: Data Informed Crawl: Lacks consistent data collection or formal reporting. Draws conclusions from incomplete data or “drive by” analysis. Walk: Data collection is consistent, but not shared between departments. Not all data is linked to decision-making for better results. Run: Data is from multiple sources and shared across departments through a dashboard. Does not collect data it doesn’t use. Measurable objectives are based on benchmarking. Fly: Establishes organizational KPIs and tracks in organizational dashboard with different views for departments or levels. May have data analyst on staff. Score: 1.67
    • 25. Measurement Indicators: Data Collection Tools Crawl: Using free or low cost analytics tools to collect metrics and analyze further in spreadsheets if required for actionable insights. Walk: Using free/low cost analytics tools to collect metrics and analyze further in spreadsheets if required for actionable insights. Run: Uses social media management/metrics professional tool and free tools to collect data and analyze further in spreadsheets if required for further actionable insights Fly: Uses professional measurement and analytics tools. Provides training or uses expert consultants to assist in data/analysis. Score: 2.21
    • 26. Measurement Indicators: Sense-Making Crawl: Does not use data to make planning decisions. Walk: Uses data for decision-making but not a formal organizational process. Run: Reports are discussed at staff meetings and used to make decisions that improve results. Fly: Formal process for analyzing, discussing, and applying results. Data visualization and formal reflection processes. Score: 2.38
    • 27. Maturity of Practice: Crawl-Walk-Run-Fly (CRAWL = 1, WALK = 2, RUN = 3, FLY = 4). Categories, practices, and group averages:
      CULTURE – Networked Mindset 2.17; Institutional Support 2.21
      CAPACITY – Staffing 2.13; Communications Strategy 2.21
      MEASUREMENT – Analysis 1.67; Tools 2.21; Adjustment 2.38
      LISTENING – Brand Monitoring 1.58; Influencer Research 1.46
      ENGAGEMENT – Ladder of Engagement 1.46
      CONTENT – Integration/Optimization 1.50
      NETWORK – Champions 1.08; Relationship Mapping 1.54
      Quick Fly Over ……
    • 28. Maturity of Practice: CWRF – Culture (Crawl / Walk / Run / Fly, with group score):
      Networked Mindset (2.17) – Crawl: Understanding of networks that are connected to organization and mapping networks. Walk: Listening to and cultivating greater relationships with networks based on transparency. Run: Comfort level with organizational openness and collective action with networks. Considers people inside and outside of the organizations as assets in strategy. Fly: Leadership is comfortable using decentralized decision-making and using social networks, and comfortable with showing personality.
      Institutional Support (2.21) – Crawl: Social media policy is drafted and gaining support through “road shows” with departments. Walk: Social media policy has been discussed and approved by leadership. Run: Social media staff position includes facilitating training other staff to use social networks. Fly: All staff use social media effectively to support organization objectives.
    • 29. Maturity of Practice: CWRF – Capacity (Crawl / Walk / Run / Fly, with group score):
      Communications Strategy (2.13) – Crawl: Consideration of communications strategy with SMART objectives and audiences and strategies for branding and web presence. Social media is not fully aligned. Walk: Strategic plan with SMART objectives and audiences for branding and web presence, include strategy points to align social media for one or two social media channels. Run: Strategic plan with SMART objectives and audience definition. Includes integrated content, engagement strategy, and formal champions/influencer program and working with aligned partners. Uses more than two social media channels. Fly: Strategic plan with SMART objectives and audience definition. Includes integrated content, engagement strategy, and formal champions/influencer program and working with aligned partners. Uses more than three social media channels. Formal process for testing and adopting social media channels.
      Hours (2.21) – Crawl: 5 hours or less per week of staff time is invested, with support or intentions to implement social media. Walk: 5-19 hours per week of staff time is invested in one position. Other staff or interns or influencers implement social media. Run: 20-29 hours per week of staff time in a dedicated social media position. Other staff or interns or influencers implement social media strategy. Fly: 30-40 hours of staff time is invested in a dedicated social media position with support staff. Other staff or interns or influencers implement social media.
    • 30. Maturity of Practice: CWRF – Measurement (Crawl / Walk / Run / Fly, with group score):
      Analysis (1.67) – Crawl: Lacks consistent data collection or formal reporting. Draws conclusions from incomplete data or “drive by” analysis. Walk: Data collection is consistent, but not shared between departments. Not all data is linked to decision-making for better results. Run: Data is from multiple sources and shared across departments through a dashboard. Does not collect data it doesn’t use. Measurable objectives are based on benchmarking. Fly: Establishes organizational KPIs and tracks in organizational dashboard with different views for departments or levels. May have data analyst on staff.
      Tools (2.21) – Crawl: Not using or not using fully. Walk: Using free or low cost analytics tools to collect metrics and analyze further in spreadsheets if required for actionable insights. Run: Using free/low cost analytics tools to collect metrics and analyze further in spreadsheets if required for actionable insights. Uses social media management/metrics professional tool to collect data. Fly: Uses professional measurement and analytics tools. Provides training or uses expert consultants to assist in data/analysis.
      Adjustment (2.38) – Crawl: Does not use data to make planning decisions. Walk: Uses data for decision-making but not a formal organizational process. Run: Reports are discussed at staff meetings and used to make decisions that improve results. Fly: Formal process for analyzing, discussing, and applying results. Data visualization and formal reflection processes.
    • 31. Maturity of Practice: CWRF – Listening (Crawl / Walk / Run / Fly, with group score):
      Brand Monitoring (1.58) – Crawl: Observing conversations and receiving Google Alerts, but not doing analysis. Walk: Tracking keywords, influencers, or conversations using free tools, but does not have a formal organizational process for synthesis and reporting. Run: Tracking keywords, influencers, and conversations using free tools and weekly/monthly reporting and synthesis. Fly: Tracking keywords, influencers, and conversations using free and paid tools and weekly/monthly reporting and synthesis. Capacity to use “real-time” information to respond. Uses both to make decisions, avoid social media crisis before escalating.
      Influencer Research (1.46) – Crawl: Not using. Walk: Uses online systems and “desk research” to identify, but is not monitoring. Run: Uses online systems and “desk research” to identify, monitor, and cultivate. Fly: Uses online systems and “desk research” to identify, monitor, and cultivate and to build an influencer strategy.
    • 32. Maturity of Practice: CWRF – Content (Crawl / Walk / Run / Fly, with group score):
      Integration and Optimization (1.46) – Crawl: Shares content that may be relevant to audience, but not consistently and not measuring. Walk: Uses an editorial calendar to align content with objectives and audiences to publish across channels consistently. Run: Uses an editorial calendar to align content with objectives and audiences to publish across channels consistently and measures performance. Fly: Uses an editorial calendar to align content with objectives and audiences to publish across channels consistently, measures performance, and uses data to plan content.
    • 33. Maturity of Practice: CWRF – Engagement (Crawl / Walk / Run / Fly, with group score):
      Ladder of Engagement (1.50) – Crawl: Not using. Walk: Informal description of different levels of engagement on different platforms or across platforms, but doesn’t align with strategy or measurement. Run: Formal description of different levels of engagement based on survey or qualitative research. Aligns with strategy, but does not have a measurement process for all steps. Fly: Formal description of different levels of engagement based on survey or qualitative research. Aligns with strategy and collects data and reports organized by engagement and conversion levels.
    • 34. Maturity of Practice: CWRF – Networking (Crawl / Walk / Run / Fly, with group score):
      Champions (1.08) – Crawl: Has partners but is not collaborating on social networks. Walk: Connects and collaborates with aligned partners in a haphazard way, not consistent or strategic. Run: Consistent conversations and connections with aligned partners on social media platform(s) and implements small pilots. Fly: Consistent collaborations with aligned partners on social channels with activities that are mutually aligned with objectives.
      Relationship Mapping (1.54) – Crawl: Lists organizations or partners but has not visualized or identified new ones. Walk: Uses low tech methods (drawings and sticky notes) to visualize networks of individuals and organizations. Run: Uses low tech methods and free social network analysis tools to visualize networks of individuals and organizations. Uses data to inform strategy and tactics. Fly: Uses low tech methods and free and paid social network analysis tools and uses resulting visualizations to inform strategy and/or measure results.
    • 35. Maturity of Practice: Reflection • What is unclear? Questions? Type into chat (*6 mute, *7 unmute). Only the moderator can see your chats.
    • 36. Next Session: Feb 4, 1:00 pm PST. Measuring the Networked Nonprofit – Finish reading Chapters 1-4. Use Assessment and Action Learning Checklist: • What area of your social media practice do you want to measure and improve? • What small measurement pilot might help your organization the most? Email me if you want to get some peer coaching on the next call about your project design: bkanter@packard.org
