
Social Media Evaluation Webinar for NAEPSDP

Slides from a webinar by Sarah Baughman & Brigitte Scott for the National Association of Extension Program and Staff Development Professionals (NAEPSDP).


  1. Evaluating Social Media in Extension Programming. National Association of Extension Program and Staff Development Professionals, October 21, 2014. Sarah Baughman, Ph.D. & Brigitte Scott, Ph.D., Military Families Learning Network, Virginia Tech; Sarah Baughman, Ph.D., eXtension
  2. Photo credit: Douglas Wrey, http://www.geek.com/wp-content/uploads/2012/02/social_media_donut.jpg
  3. Ways to Use Social Media
  4. Ways to Use Social Media: Annual Report from Calgary Zoo
  5. Evaluating a “stand-alone” social media campaign: Ohio State Kitchen & Campus Dairy Campaign, Jamie Seger
  6. Theory of Change
  7. • Support F2F workshops with information on Facebook • Use a FB page to encourage discussion of educational information presented F2F • Provide FB incentives for the person who improves the most or tries something new • Have participants each take a different day to share one recipe they have tried or something from their food log that is working (or maybe not working) • Invite family members to the FB page to encourage participants • Highlight a different family member every week and how they have helped support healthier eating
  8. Conversion, Engagement, Reach
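Slide 8's three terms are commonly read as a funnel: reach (who saw the content), engagement (who interacted with it), and conversion (who took the target action). A minimal sketch of how the tiers relate as rates, using made-up counts rather than figures from any platform:

```python
# Sketch of the reach -> engagement -> conversion funnel.
# The counts below are hypothetical; real numbers would come from
# platform analytics (e.g., a Facebook Insights export).

reach = 5000        # people who saw the post
engagement = 400    # likes, comments, shares, clicks
conversion = 35     # people who took the target action (signed up, attended)

engagement_rate = engagement / reach
conversion_rate = conversion / engagement

print(f"Engagement rate: {engagement_rate:.1%}")   # 8.0%
print(f"Conversion rate: {conversion_rate:.1%}")   # 8.8%
```

Tracking the same two rates over time is one way to satisfy the "consistent metrics and measurement" key on slide 10.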
  9. Where’s the Data? Resources for Measuring and Evaluating Social Media. Facebook: Insights. Twitter: Tweetreach, Hootsuite, Tweepsmap, Buffer, Mentionmapp. Pinterest: Tailwind. Multiplatform: Sproutsocial, SimplyMeasured, SumAll, Google Analytics.
  10. Evaluation Keys: a clear purpose; clear goals; consistent metrics and measurement over time; use the measurement data to learn and improve.
  11. Contact Information: Sarah Baughman, 540-231-7142, baughman@vt.edu, @programeval, Gplus.to/SarahBaughman
  12. Getting to the Why behind the Numbers (CC BY 3.0)
  13. You are already doing qualitative analysis. (CC BY 3.0)
  14. From TweetReach… From Facebook Insights…
  15. Photo by LauraGilchrist4, Creative Commons Attribution-NonCommercial-ShareAlike License, https://www.flickr.com/photos/76060406@N07. Created with Haiku Deck.
  16. Basic Text Analysis
  17. Basic Text Analysis: Inductive. Use data to discover concepts, themes, or models.
  18. Basic Text Analysis: Inductive. Use data to discover concepts, themes, or models. Evaluator as interpreter; highly involved.
  19. Basic Text Analysis: Inductive. Use data to discover concepts, themes, or models. Evaluator as interpreter; highly involved. Emergent, “bottom up.”
  20. Basic Text Analysis: Inductive. Use data to discover concepts, themes, or models. Evaluator as interpreter; highly involved. Emergent, “bottom up.” Qualitative outcome: key themes or categories relevant to evaluation/research questions.
  21. Application: Inductive Analysis. Facebook posts; tweetchats or hashtags; blog posts; LinkedIn discussions.
  22. Basic Inductive Analysis: 5 Steps (see the sketch after these steps)
  23. Step 1. Collect your raw data.
  24. Step 2. Read. And read again. And: get organized!
  25. Step 3. Create and apply codes. (Repeat.)
  26. Step 4. Refine codes to reduce overlap.
  27. Step 5. Create categories.
  28. Narrative Analysis (Step 6, really).
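The five inductive steps map onto a simple workflow: collect raw items, code them, refine the codes, then group codes into categories. A minimal sketch in Python, with invented posts, codes, and categories standing in for real coded data:

```python
# Hypothetical sketch of steps 3-5: apply codes to raw items,
# then roll refined codes up into categories. All posts, code
# names, and categories below are invented for illustration.

from collections import defaultdict

raw_posts = [
    "Tried the kale recipe from class, my kids actually ate it!",
    "Hard to keep the food log going during the holidays.",
    "Shared my grocery list with the group, got great swaps.",
]

# Step 3: code each post (repeat and revise as themes emerge)
coded = {
    0: ["recipe_success", "family_involvement"],
    1: ["logging_barrier"],
    2: ["peer_support"],
}

# Step 5: group the refined codes into categories
categories = {
    "behavior_change": {"recipe_success", "logging_barrier"},
    "social_support": {"family_involvement", "peer_support"},
}

posts_by_category = defaultdict(list)
for idx, codes in coded.items():
    for name, members in categories.items():
        if members & set(codes):  # any shared code puts the post in
            posts_by_category[name].append(raw_posts[idx])

for name, posts in posts_by_category.items():
    print(name, len(posts))  # counts feed the narrative analysis step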
  29. Basic Text Analysis: Deductive. Data is analyzed according to prior assumptions.
  30. Basic Text Analysis: Deductive. Data is analyzed according to prior assumptions. Evaluator is “independent” of the data.
  31. Basic Text Analysis: Deductive. Data is analyzed according to prior assumptions. Evaluator is “independent” of the data. A priori; “top down.”
  32. Basic Text Analysis: Deductive. Data is analyzed according to prior assumptions. Evaluator is “independent” of the data. A priori; “top down.” Quantitative outcome: metrics relevant to evaluation/research objectives.
  33. Application: Deductive Analysis. Category comparison and comparison over time; analyzing webinar chat pods; analyzing how a hashtag is leveraged in tweets; Facebook/LinkedIn audience engagement.
  34. Basic Deductive Analysis: 4 Steps. 1. Develop data categories. 2. Clearly define those categories. 3. Read through all raw data and apply categories. 4. Count.
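Because the categories and their definitions are fixed before analysis begins, the deductive pass lends itself to partial automation. A minimal sketch using a keyword-match first pass over hypothetical tweets (the hashtag, keywords, and categories are invented); a human reviewer would still verify the matches per step 3:

```python
# Hypothetical deductive pass: categories and keywords defined
# a priori, then every raw item is checked and counted (step 4).

tweets = [
    "Loved the #MFLNchat discussion on caregiving resources",
    "Registering for next week's webinar, who else is going?",
    "Here's the slide deck from today: http://example.com/slides",
]

categories = {
    "resource_sharing": ["http", "slide", "handout"],
    "event_promotion": ["register", "webinar", "join us"],
    "discussion": ["chat", "discussion", "thoughts?"],
}

counts = {name: 0 for name in categories}
for tweet in tweets:
    text = tweet.lower()
    for name, keywords in categories.items():
        if any(kw in text for kw in keywords):
            counts[name] += 1

print(counts)
# {'resource_sharing': 1, 'event_promotion': 1, 'discussion': 1}
```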
  35. Chat Pod Engagement Metrics [bar chart: unique participant-to-participant exchanges, participant questions, resources shared by MFLN, resources shared by participants, unique chat pod participants; plotted values 21, 0, 17, 10, 5 on a 0–25 axis]
  36. The fine print… Only DCO (Defense Connect Online) viewers can participate in the chat pod; the percentage of chat pod participants is based on the total number of DCO viewers and the total number of unique participants. Resources shared by participants include shared links, authors, studies, books, etc.; this demonstrates high-level engagement because participants are contributing to the co-construction of knowledge during the webinar. Resources shared by MFLN include links, peer-reviewed studies, books, etc., from both MFLN and non-MFLN authors; this demonstrates direct CA engagement with participants by further supporting and contextualizing knowledge construction, situating the webinar presentation within the larger disciplinary area. Participant questions are those posted in the chat pod; these demonstrate intent to pursue two-way engagement in the webinar and therefore high-level engagement. Unique participant-to-participant exchanges are those in which chat pod participants respond directly to one another’s comments; these demonstrate high-level engagement through realized reactive (two-way) and interactive (dependent) discourse patterns. Chat pod text related to webinar content is not captured as an engagement measure because of its discursive category as declarative (one-way) communication. (Declarative text is still understood to indicate webinar engagement, and MFLN encourages and values such participation.) Chat pod text related to technical issues and/or CEUs is not included in MFLN evaluation.
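Those definitions translate directly into countable rules over a chat transcript. A minimal sketch, assuming a hypothetical transcript of (speaker, message) pairs and deliberately simplified heuristics (a trailing "?" marks a question, a URL marks a shared resource); participant-to-participant exchanges are omitted because spotting direct replies reliably takes hand coding:

```python
# Hypothetical sketch of computing chat pod engagement metrics.
# Heuristics are simplified stand-ins for the fine-print definitions.

transcript = [
    ("MFLN", "Here's the study we mentioned: http://example.org/study"),
    ("Pat", "Does this apply to part-time staff?"),
    ("Lee", "Pat, we use it with part-timers too."),
    ("Pat", "Thanks Lee, good to know!"),
]

staff = {"MFLN"}

participants = {s for s, _ in transcript if s not in staff}
questions = sum(
    1 for s, m in transcript if s not in staff and m.strip().endswith("?")
)
staff_resources = sum(
    1 for s, m in transcript if s in staff and "http" in m
)
participant_resources = sum(
    1 for s, m in transcript if s not in staff and "http" in m
)

print("Unique chat pod participants:", len(participants))          # 2
print("Participant questions:", questions)                         # 1
print("Resources shared by MFLN:", staff_resources)                # 1
print("Resources shared by participants:", participant_resources)  # 0
```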
  37. Storytelling
  38. Storytelling: Identify narratives that connect to your evaluation aims.
  39. Storytelling: Identify narratives that connect to your evaluation aims. Be strategic and leverage stories for the evaluation task at hand.
  40. Storytelling: Identify narratives that connect to your evaluation aims. Be strategic and leverage stories for the evaluation task at hand. Contextualize your stories with other data to show a larger picture.
  41. Storytelling: Identify narratives that connect to your evaluation aims. Be strategic and leverage stories for the evaluation task at hand. Contextualize your stories with other data to show a larger picture. Ethics, ethics, ethics.
  42. Storytelling: How? Watch for stories. ASK for stories. Tell your own stories.
  43. From the Master Gardeners… “On a Celebrex commercial a guy is shown bent over in some beets or chard and he raises up with a beautiful eggplant! The first time I laughed at it my wife thought I was crazy.”
  44. Application: Storytelling and Evaluation. Use stories in your reports, and include an executive summary of those stories. Incorporate compelling stories with facts and figures. Include stories with direct quotes in press releases and on websites. Include stories and quotes in newsletters, brochures, and annual reports.
  45. Larger Considerations
  46. Larger Considerations: Reflexivity, Transparency, Credibility, Ethics
  47. Contact Information: Brigitte Scott, 540-231-3990, brigitte.scott@extension.org, @4ed_eval
