Why aren't Evaluators using Digital Media Analytics?


Whether it’s through blogs, tweets, or even the comments section of an online newspaper, the world is increasingly talking online. However, the potential uses for the massive amounts of information available on the internet remain largely untapped in the sphere of evaluation.

This presentation will explore innovative methods for extracting insights from the large and complex collections of digital data publicly available online. In particular, we will examine the unprecedented uses, and potential limitations, of digital media analytics to:

• Measure the outcomes of public outreach, advocacy, communications, and information sharing programs;
• Establish current and retroactive baselines;
• Conduct “borderless” data collection to gain insights from other countries, as well as diaspora communities in Canada;
• Identify unknown stakeholder groups and create detailed stakeholder maps; and,
• Provide context and insight to inform further data collection.


Transcript

1. Why aren’t evaluators taking advantage of Digital Media Analytics?
   from social media to open web sources
   Giles Crouch, CEO, giles@mediabadger.com
   Tasha Truant, Consultant Manager, ttruant@ggi.ca
2. agenda
   • The Internet | Breaking Assumptions
   • What is Cyber Analytics?
   • Examples
   • Strengths and Challenges
   • Applications and Opportunities for Evaluation
   • Case Studies
   • MediaBadger + Goss Gilroy
3. the internet
4. dispelling some myths
   • The average age of social media users is 36 (USA, Canada, UK, EU)
   • The 55+ demographic is the fastest growing
   • 25% of all content is now uploaded from mobile devices
   • Connecting to social media and the web is no longer done only at home from a desktop
   • Approximately 40% of the developing world is on social media
5. what social media can tell us
   • Sentiment on issues (societal, political, etc.)
   • Key influencers and their networks
   • Cultural behaviours and actions (e.g. Kenya: local social media versus Facebook)
   • Myths and narratives on issues, e.g. “tar sands are acidic” (in reality, bitumen is less acidic than traditional oil and gas)
   • It is important to understand myths and how they turn into narratives; once formed, narratives are very difficult to change
6. what social media can tell us
   • Demographics: all social media apps record certain metadata, such as location (geo), time, and IP address
   • Keywords
   • When gender isn’t identified, it can be inferred via face recognition on profile pictures
   • Over 60% of posts contain a person mentioning where they are from, along with items such as tribe (e.g. for First Nations) and often city/province
   • A combination of identifiers is used to create a user profile, like the Target example from yesterday (a toy sketch of this step follows below)
   • The margin of error is between 4% and 6%
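As a rough illustration of the profiling step this slide describes, here is a minimal sketch that aggregates a few public metadata fields from one user's posts. The `Post` and `build_profile` names, the field layout, and the place list are invented for this example; this is not MediaBadger's actual pipeline.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical shape of a public post; real platforms expose richer metadata.
@dataclass
class Post:
    user: str
    text: str
    geo: str | None   # platform-supplied location tag, if any
    timestamp: str    # ISO-8601 creation time

@dataclass
class UserProfile:
    user: str
    post_count: int = 0
    locations: Counter = field(default_factory=Counter)

def build_profile(posts: list[Post]) -> UserProfile:
    """Combine geotags and self-reported locations into one profile."""
    profile = UserProfile(user=posts[0].user)
    for p in posts:
        profile.post_count += 1
        if p.geo:
            profile.locations[p.geo] += 1
        # The slide notes that most posts mention where the author is from;
        # a real system would extract place names, not match a fixed list.
        for place in ("Fredericton", "Moncton", "Halifax"):
            if place in p.text:
                profile.locations[place] += 1
    return profile

posts = [
    Post("u1", "Flooding on my street here in Fredericton", "NB, Canada", "2010-12-14T09:00:00"),
    Post("u1", "Road to Moncton is closed", None, "2010-12-14T11:30:00"),
]
print(build_profile(posts).locations.most_common())  # most likely home location first
```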
7. how we make sense of all this data
8. examples of applications
9. 2010 New Brunswick Flood Event
   Historical Trends | Communications Patterns | Citizen Behaviours
   • Examined how people used social media immediately before, during, and after the flood event
   • Findings: citizens paid very little attention to crisis communications from authorities
   • Instead, they relied on each other, posting pictures and videos and warning people in forums of where to go or not go
   • The research led to an overhaul of emergency communications and the use of social media
10. a connected society
    Haiti, n=3000 (!)
11. strengths | challenges going forward
12. difference of approach
    Traditional (survey):
    • Limited data
    • Time snapshot (a specific point)
    • Curated and pristine (must be clean)
    • n = xxxx
    • Defined, limited
    • Active bias (bias in asking the questions)
    Cyber research:
    • Massive data
    • Historical data (1985 onward)
    • Chaotic accuracy (messiness is OK)
    • N = ALL
    • Exponential
    • Passive bias (listening bias)
13. premise for use in evaluations
    • Evaluators spend a lot of time and money trying to find out how people perceive, and are affected by, programs, policies, and organizations
    • The world is increasingly voicing its opinions online
    • Methods to extract insights from the large and complex collections of openly available digital data already exist
14. possible applications
    • Impact of public policy programmes and projects
    • Large data sets, near real-time
    • Historical trending and sentiment
    • Informing research design
    • Supporting field research, focus groups, and interviews
    • Use alongside traditional lines of evidence
    • Establishing baselines for analysis and future monitoring (a toy sketch of a retroactive sentiment baseline follows below)
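To make “historical trending and sentiment” concrete, here is a minimal, hypothetical sketch of a lexicon-based sentiment score aggregated by month, the kind of series that could serve as a retroactive baseline. The tiny word lists, the `score` and `monthly_sentiment` helpers, and the sample posts are all illustrative assumptions; production tools use far richer lexicons or trained classifiers.

```python
import re
from collections import defaultdict

# Toy sentiment lexicon; real systems use large, validated lexicons.
POSITIVE = {"great", "helpful", "love", "improved"}
NEGATIVE = {"angry", "failed", "waste", "ignored"}

def score(text: str) -> int:
    """Crude polarity: +1 per positive word, -1 per negative word."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def monthly_sentiment(posts: list[tuple[str, str]]) -> dict[str, float]:
    """Average polarity per month: ('YYYY-MM-DD', text) in, {'YYYY-MM': mean} out."""
    buckets: dict[str, list[int]] = defaultdict(list)
    for date, text in posts:
        buckets[date[:7]].append(score(text))
    return {month: sum(s) / len(s) for month, s in sorted(buckets.items())}

posts = [
    ("2012-03-02", "The new program is great and really helpful"),
    ("2012-03-18", "Another failed promise, what a waste"),
    ("2012-04-05", "Service has improved, love the changes"),
]
print(monthly_sentiment(posts))  # {'2012-03': 0.0, '2012-04': 2.0}
```

Because public posts persist, the same computation can be run over archives to reconstruct a baseline for periods before an evaluation was commissioned.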
15. opportunities: measurement & context
    • Measure the outcomes of public outreach, communication, advocacy, and information sharing programs
    • Create detailed stakeholder maps: maps of social networks that capture relationships, provide insight into their nature, and identify unknown stakeholders and influencers (see the sketch after this list)
    • Provide context and insight to inform further data collection, e.g. country profiles of internet/social media usage
    • Historical trends: take a snapshot of these data at several points in time
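One way to picture the stakeholder-mapping idea: build a graph of who mentions or replies to whom, then rank accounts by centrality to surface likely influencers. The sketch below assumes the networkx library; the account names and interactions are invented for illustration, and this is a sketch of the concept rather than the presenters' tooling.

```python
import networkx as nx  # assumes: pip install networkx

# Directed graph: an edge A -> B means account A mentioned or replied to B.
# All accounts and interactions here are hypothetical.
interactions = [
    ("citizen_1", "ngo_hub"), ("citizen_2", "ngo_hub"),
    ("citizen_3", "ngo_hub"), ("ngo_hub", "gov_dept"),
    ("citizen_1", "local_news"), ("local_news", "gov_dept"),
]
G = nx.DiGraph(interactions)

# Accounts that many others point at are candidate influencers/stakeholder hubs.
centrality = nx.in_degree_centrality(G)
for account, c in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{account}: {c:.2f}")  # ngo_hub ranks first in this toy data
```

The same graph can reveal stakeholder groups the evaluators did not know existed: clusters of accounts that interact heavily with each other but were never on the original stakeholder list.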
16. opportunities: respondent groups
    • Traditional methodologies have a hard time reaching certain respondent groups
    • Useful for gathering unvarnished views from groups that have the access and means to get online but benefit from anonymity, including:
    • Data collection in sensitive environments (e.g. post-conflict zones)
    • Obtaining views on issues people are quiet about in person (e.g. racism)
    • Gathering perceptions from beneficiaries averse to authority (vulnerable and marginalized populations, criminal offenders, youth)
17. When there could have been Sentiment Analysis in Evaluation
    Arts Promotion: the program aims to build stronger citizen engagement in communities through the performing and visual arts and in the expression, celebration, and preservation of local historical heritage. “Limited evidence gathered to fully assess the ultimate outcomes of the program, stemming from … the fact that the evaluation team could not gather direct views from a representative sample of volunteers and the general public.”
    Strengthen Civil Society: the program’s goal is to promote resilient, healthy, and just communities and support processes that strengthen civil society. “Determining the extent to which [the program] has increased the knowledge or actions taken by Canadians in food, environmental and biodiversity issues was not possible within the scope and budget of this evaluation in the absence of survey data or a baseline.”
18. When there could have been Sentiment Analysis in Evaluation
    Canadian Culture: the program aims to develop Canadian writers and to publish and disseminate their books effectively in Canada and abroad. The ultimate program outcome is “increased access to a diverse range of Canadian-authored books in Canada and abroad”. “Certain program outcomes could not be directly measured during the evaluation… For example, in order to measure indicators such as ‘Increased Awareness’ and ‘Increased Access’, the evaluation has had to use proxy indicators (e.g. sales) to infer awareness and access.”
    Immigration/Settlement: the program aims to contribute to improving the labour market integration outcomes of foreign-trained individuals in targeted occupations and sectors. In most areas of anticipated outcomes, there was no baseline measure at the point of implementation of the program. As a result, measurements of change or improvement rely on the recall and opinion of current respondents.
19. biases + challenges
    Survey:
    • Trying to locate users/beneficiaries
    • With small amounts of data, accuracy is key
    • Locating hard-to-reach populations
    • Sampling biases (whom to survey?)
    • Selection biases (people who respond to surveys)
    Sentiment analysis:
    • Production of intentionally misleading content (i.e. astroturfing)
    • Overly aggressive behaviours (troll tendencies)
    • Difficulty in detecting sarcasm and irony
    • Selection biases (people who post opinions online)
20. Privacy and Ethics
    • All data collected are publicly accessible; no private data are accessed.
    • MediaBadger is in full compliance with Canada’s privacy law (PIPEDA) at all times.
    • MediaBadger is reviewing membership in industry associations that provide clear ethical guidelines in line with our values.
    • Digital Media Analytics and Sentiment Analysis maintain the confidentiality of all respondents; responses are aggregated in ways similar to traditional lines of evidence (e.g. key informant interviews).
21. partnering
    Bringing advanced cyber analytics to program evaluation.
    • Exploring opportunities for digital media analytics as a line of evidence in evaluation
    • Difficult when ToRs are established and budgets are fixed
    • Outreach to the designers of evaluations (many of you!)
    • Looking for opportunities to further the field of evaluation through this work
22. Giles Crouch, CEO, giles@mediabadger.com
    Tasha Truant, Consultant Manager, ttruant@ggi.ca