
What makes good video? Using data to do better with our content.


Anna Chiaretta Lavatelli, Trilce Navarrete, Elena Villaespesa, Emily Robbins

The Media Production and Branding SIG and the Data and Insights SIG have combined efforts to develop an online survey to gain an understanding of the scope of production and goals of production in museums today. In this talk we will present an analysis of the collected data from the survey to gain understanding of the state of video production in museums.


  1. What is Good Video? Using data to do better with our content. Anna Chiaretta Lavatelli, Director of Digital Media, Museum of Contemporary Art Chicago (@annachiaretta); Trilce Navarrete, Erasmus University Rotterdam (@trilcenavarrete); Emily Robbins, Web and Digital Assistant, San Francisco Museum of Modern Art (@EmBRobbins); Elena Villaespesa, Digital Analyst, The Metropolitan Museum of Art (@elenustika)
  2. Why video analysis? Museums increasingly use social media to connect with younger audiences and to attract new ones. Performance indicators serve as metrics to compare with peers, but there is no such thing as "the" performance of an institution: metrics only make sense in context. The sector still lacks a harmonized method for measuring success and for comparing institutional performance. We invited the MCN community to participate in a survey to help us identify current practice, which led to this presentation. Here are the results of the survey... but first, a global view of the use of social media.
  3. Increased use of video. Internet video accounts for more than 70% of all content traffic and is projected to quadruple from 2015 to 2020 (Cisco). Google = #1, YouTube = #2, Facebook = #3, ... Wikipedia = #6 (Alexa). YouTube: over 1 billion unique visitors per month, 1 billion video views per day. Facebook: over 1 billion users, 8 billion video views per day. Wikipedia: over 15 million visitors per month. Video = information, communication and entertainment. What do MCN museums do with video?
  4. Survey methodology. Research objectives: understand video content production strategies; get an overview of the resources dedicated to video production; identify the channels used for video distribution; list the main challenges in evaluating video content. 26 responses; URL distributed among MCN members.
  5. Q: Does your museum produce video content? Video production: 1-100 videos per year.
  6. Q: What production positions exist at your institution?
  7. Q: What types of media products does your museum produce?
  8. Q: When you have a new video to upload...
  9. Q: How do you currently distribute video?
  10. Q: What tools do you use to evaluate your videos' performance?
  11. Q: What metrics do you report to your team and senior management?
  12. Q: What are your challenges when evaluating video content?
  13. What does this all mean? From the 26 MCN institutions that responded to the survey we can say: museums use major social media as channels for video distribution (#1 YouTube, #2 Facebook, #3 Vimeo and Instagram, #4 Twitter); there is interest in knowing more about the impact of video production; the number of views is the most used metric. Main challenges: limited know-how and resources to support video production and evaluation, and the lack of an institutional strategy to guide analysis. We need a 'how to' guide for video metrics.
  14. Quick guide on how to develop your own video production evaluation metrics. Strategy of desired goal: what do you want to achieve with video? (qualitative/quantitative). Starting point to compare against after a change in practice: where are you at? (reach, engagement, impact). Some metrics gathered regularly: how are things changing? (quantitative/qualitative). Reflect on your progress: what does this mean? Eventually, make up your own indicators (e.g. Key Intangible Performance Indicators).
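The "metrics gathered regularly" step can be sketched as a small script. This is a minimal illustration, not part of the talk: the field names and numbers are invented, and the engagement-rate formula (engaged actions per view) is one common convention you would adapt to your own strategy.

```python
# Minimal sketch of regular metric gathering, assuming you export raw
# counts (views, likes, comments, shares) per video from each platform.
# Field names and figures are illustrative, not from any specific API.

def engagement_rate(views, likes, comments, shares):
    """Engaged actions per view; returns 0.0 for unwatched videos."""
    if views == 0:
        return 0.0
    return (likes + comments + shares) / views

videos = [
    {"title": "Artist interview", "views": 1200, "likes": 90, "comments": 12, "shares": 30},
    {"title": "Install time-lapse", "views": 400, "likes": 8, "comments": 1, "shares": 3},
]

for v in videos:
    rate = engagement_rate(v["views"], v["likes"], v["comments"], v["shares"])
    print(f"{v['title']}: reach={v['views']}, engagement={rate:.1%}")
```

Logging a line like this per video at a regular interval gives you the "where are you at?" baseline and the trend over time.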
  15. Q: What metrics do you report to your team and senior management? [Diagram mapping the reported metrics to the categories Reach, Engagement and Impact; user surveys check that the strategic goal is being met.]
  16. Video evaluation framework
  17. Draft benchmark on video production. Strategy of desired goal: what do you want to achieve with video? (qualitative/quantitative). Starting point to compare against after a change in practice: where are you at? (reach, engagement, impact). Some metrics gathered regularly: how are things changing? (quantitative/qualitative). Reflect on your progress: what does this mean? Eventually, make up your own indicators (e.g. Key Intangible Performance Indicators).
  18. Example - The Met (Elena)
  19. Benchmarking on YouTube
  20. Get the YouTube channel ID: 1. Open the browser's developer tools. 2. Select the channel owner's name. 3. Copy the code from the data-ytid field.
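Step 3 can also be automated. A small sketch, assuming the channel ID is exposed in a data-ytid attribute as it was in YouTube's page markup at the time of this talk; the sample HTML snippet and channel ID below are made up.

```python
import re

# Pull the channel ID out of copied YouTube page markup.
# The sample snippet and ID are invented for illustration.

def extract_channel_id(html):
    """Return the first data-ytid value found, or None."""
    match = re.search(r'data-ytid="([^"]+)"', html)
    return match.group(1) if match else None

sample = '<span class="yt-uix-sessionlink" data-ytid="UCabcdef1234567890ABCDEF">The Museum</span>'
print(extract_channel_id(sample))  # → UCabcdef1234567890ABCDEF
```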
  21. Get the YouTube data. YouTube data scraping tool from Digital Methods: https://tools.digitalmethods.net/netvizz/youtube/mod_videos_list.php
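If the scraping tool is unavailable, the same per-video counts can be pulled from the YouTube Data API v3 (videos.list endpoint with part=statistics; requires a free API key). This sketch only builds the request URL; the key and video ID are placeholders, and fetching and parsing the JSON response is left out.

```python
from urllib.parse import urlencode

# Build a videos.list request for the YouTube Data API v3, which
# returns view/like/comment counts per video. The API key and video
# IDs passed in are placeholders, not real credentials.

API_BASE = "https://www.googleapis.com/youtube/v3/videos"

def stats_url(video_ids, api_key):
    """URL requesting statistics and snippet data for the given videos."""
    params = {
        "part": "statistics,snippet",
        "id": ",".join(video_ids),
        "key": api_key,
    }
    return f"{API_BASE}?{urlencode(params)}"

print(stats_url(["VIDEO_ID_1", "VIDEO_ID_2"], "YOUR_API_KEY"))
```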
  22. Selecting benchmark data. Who? 1. Peer museums: similar budget, subject and scale of program. 2. Aspirational museums or content producers.
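Once per-video view counts have been pulled for your own channel and a peer's (for instance with the tools above), a first benchmark can be as simple as average views per video alongside posting frequency. A hedged sketch; the channel names and figures are invented.

```python
# Compare your channel against a peer on two coarse indicators:
# average views per video and posting frequency. All numbers invented.

def summarize(channel_name, view_counts, videos_per_month):
    """One benchmark line per channel."""
    avg = sum(view_counts) / len(view_counts)
    return f"{channel_name}: {avg:.0f} avg views/video, {videos_per_month} posts/month"

print(summarize("Our museum", [1200, 400, 900], 2))
print(summarize("Peer museum", [3000, 2500], 6))
```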
  23. Learnings. Good practices: frequent posts; short duration; there are exceptions to every rule. Easy wins if popularity is your goal: celebrities.
  24. Example. Goal: more views and followers. Production goals to date: capture program content; create better-produced content (edited narratives). New practices: post video clips (outtakes); timely release of content; prototype informal content capture.
  25. Case Study: Video Production and Analysis at SFMOMA (@EmBRobbins)
  26. Background. Two teams produce three streams of video. Initial strategies were developed in preparation for the big opening moment; now in a moment of reflecting back. Web and Digital Content: institutional storytelling. Interpretive Media: artist interviews.
  27. Artist interviews: 3-5 minute interviews with artists about their practice or an aspect of their work. Goals: provide insightful content to current web/museum audiences; create a long-lasting public resource; depth of impact > breadth of reach. Priorities: artists in the collection (older artists are prioritized); relevance of production to on-site programming; strength of video material.
  28. Analytical process: alternation between production and analysis phases. Analytics findings: video was more popular than other types of content; "cinematic" videos were viewed more than static talking-head interviews; videos published to coincide with current events got more views. Next steps: measure depth of impact using qualitative research.
  29. Institutional storytelling: 3-5 minute videos telling stories about artists and museum workers working on installations, projects, programs and performances. Goals: humanize the museum, artists and artworks; increase depth of interest in the museum and its offerings through storytelling; provide information about artists, artworks, exhibitions and performances; attract visitors to the museum and promote specific activities. Priorities: alignment with institutional priorities and messaging; strength of story and visually compelling elements; budget.
  30. Analytical approach: entering an analytics phase after a period of production. Initial findings: some platforms generate more views than others; location on the website affects viewership; videos about iconic aspects of SFMOMA received more views. Next steps: strategize about publishing platforms; analyze website pathways to improve video placement on the website; survey niche audiences (like teachers) to determine impact.
  31. Different approaches for different goals. View counts have different implications depending on video goals; qualitative analysis is needed to measure "impact"; for some videos, views matter less than who watches and what they get out of it. Evaluate the platform, not just the content.
  32. Conclusions
  33. Develop your own video production evaluation metrics. Strategy of desired goal: what do you want to achieve with video? (qualitative/quantitative). Starting point to compare output against after a change in practice: where are you at? (reach, engagement, impact). Some metrics gathered regularly: how are things changing? (quantitative/qualitative). Reflect on your progress: what does this mean?
  34. Next steps/goals. Outcomes from survey and evaluation: iterating on our data capture; creating peer groups for benchmarking; continuing to build evaluation tools. MCN community survey version 2: additional questions based on the results of the first survey and responses at the presentation. The new survey: http://bit.ly/2ezBl5B
  35. Best practices. IDENTIFY GOALS: define what impact means for you. PROTOTYPE: create content, distribute it in different ways, and analyze the results. FRAMEWORK: select evaluation methods. BENCHMARK: identify peers and aspirational content producers. ITERATE.
  36. Thank you. Anna Chiaretta Lavatelli, Director of Digital Media, Museum of Contemporary Art Chicago (@annachiaretta); Trilce Navarrete, Erasmus University Rotterdam (@trilcenavarrete); Emily Robbins, Web and Digital Assistant, San Francisco Museum of Modern Art (@EmBRobbins); Elena Villaespesa, Digital Analyst, The Metropolitan Museum of Art (@elenustika)
