Anna Chiaretta Lavatelli, Trilce Navarrete, Elena Villaespesa, Emily Robbins
The Media Production and Branding SIG and the Data and Insights SIG have combined efforts to develop an online survey to gain an understanding of the scope and goals of video production in museums today. In this talk we will present an analysis of the data collected from the survey to gain an understanding of the state of video production in museums.
What makes good video? Using data to do better with our content.
1. What is Good Video?
Using data to do better with our content
Anna Chiaretta Lavatelli, Director of Digital Media, Museum of Contemporary Art Chicago
(@annachiaretta)
Trilce Navarrete, Erasmus University Rotterdam (@trilcenavarrete)
Emily Robbins, Web and Digital Assistant, San Francisco Museum of Modern Art (@EmBRobbins)
Elena Villaespesa, Digital Analyst, The Metropolitan Museum of Art (@elenustika)
2. Why Video Analysis?
Social media use in museums is growing as a way to connect with younger
audiences, and particularly to attract new audiences.
Performance indicators serve as metrics to compare with peers. There is no
such thing as 'the performance' of an institution: metrics make sense when
placed in context.
Yet we lack a harmonized method for measuring success, and for comparing
institutional performance across the sector.
We invited the MCN community to participate in a survey to help us identify
current practice, which led to this presentation.
Here are the results of the survey… but first, a world view of the use of social
media.
3.
4. Increase use of video
Internet video traffic accounts for over 70% of all internet traffic, and is
projected to grow fourfold from 2015 to 2020 (Cisco).
Google = #1, YouTube = #2, Facebook = #3, … Wikipedia = #6 … (Alexa)
YouTube = over 1 billion unique visitors per month, 1 billion video views per day
Facebook = over 1 billion users, 8 billion video views per day
Wikipedia = over 15 million visitors per month
Video = information, communication and entertainment.
What do MCN museums do with video?
5. Survey methodology
• Research objectives
• Understand video content production strategies
• Get an overview of the resources dedicated to video production
• Identify the channels used for video distribution
• List the main challenges to evaluate video content
• 26 responses
• URL distributed among MCN members
6. Q: Does your museum produce video content?
Video production: 1-100 videos per year
11. Q: What tools do you use to evaluate your videos' performance?
12. Q: What metrics do you report to your team and senior management?
13. Q: What are your challenges when evaluating video content?
14. What does this all mean?
From the 26 MCN institutions that responded to the survey we can say:
• Museums use major social media as channels for video distribution
• #1 = YouTube, #2 = Facebook
• # 3 = Vimeo + Instagram, #4 = Twitter
• There is interest in knowing more about the impact of video production
• # of views is the most-used metric
• Main challenges:
• Limited know-how / resources to support video production / evaluation
• Lack of an institutional strategy to guide analysis
• We need a 'how to' guide for video metrics
15.
16. Quick guide on how to develop your own video production evaluation metrics
Strategy of desired goal: What do you want to achieve with video?
○Qualitative / Quantitative
Start point to compare after change in practice: Where are you at?
○Reach
○Engagement
○Impact
Some metrics gathered regularly: How are things changing?
○Quantitative / Qualitative
Reflect on your progress: What does this mean?
Eventually, make up your own indicators (e.g. Key Intangible Performance Indicators)
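The loop above (set a goal, record a baseline, gather the same metrics regularly, reflect on the change) can be sketched in a few lines. This is a minimal illustration only: the metric names and numbers below are hypothetical, not survey data.

```python
# A minimal sketch of the evaluation loop above: record baseline metrics,
# gather the same metrics again later, and report the change.
# All metric names and values are hypothetical examples.

def progress(baseline, current):
    """Return the change for each metric present in both snapshots."""
    return {k: round(current[k] - baseline[k], 2)
            for k in baseline if k in current}

# "Where are you at?" -- a starting point before a change in practice
baseline = {"reach_views": 12000,
            "engagement_avg_pct_watched": 34.0,
            "impact_survey_score": 3.1}

# "How are things changing?" -- the same metrics gathered later
current = {"reach_views": 18500,
           "engagement_avg_pct_watched": 41.5,
           "impact_survey_score": 3.4}

delta = progress(baseline, current)
for metric, change in delta.items():
    print(f"{metric}: {change:+}")
```

The point is not the arithmetic but the discipline: the comparison is only meaningful because the same metrics were defined before the change in practice.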
17. Q: What metrics do you report to your team and senior management?
[Table of example metrics grouped by Reach, Engagement, and Impact; e.g., under Impact, user surveys to see that the strategic goal is being met]
19. Draft benchmark on video production
Strategy of desired goal: What do you want to achieve with video?
○Qualitative / Quantitative
Start point to compare after change in practice: Where are you at?
○Reach
○Engagement
○Impact
Some metrics gathered regularly: How are things changing?
○Quantitative / Qualitative
Reflect on your progress: What does this mean?
Eventually, make up your own indicators (e.g. Key Intangible Performance Indicators)
22. Get the YouTube Channel ID
1. Open the browser's developer tools
2. Select the channel owner's name
3. Copy the code from the data-ytid field
23. Get the YouTube Data
YouTube data scraping tool from Digital Methods:
https://tools.digitalmethods.net/netvizz/youtube/mod_videos_list.php
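As an alternative to the Digital Methods tool, per-video statistics can also be pulled directly from the YouTube Data API v3. The sketch below only builds the request URL and parses the documented JSON shape; `stats_url` and `view_counts` are illustrative helper names, and you would supply your own API key and video IDs to run it against the live API.

```python
# Hedged sketch: fetch per-video statistics via the YouTube Data API v3.
# Only the request construction and response parsing are shown; no network
# call is made here. Supply a real API key and video IDs to use it live.
from urllib.parse import urlencode

API_BASE = "https://www.googleapis.com/youtube/v3/videos"

def stats_url(video_ids, api_key):
    """Build a videos.list request URL for a batch of video IDs."""
    params = {"part": "statistics,contentDetails",  # views; duration
              "id": ",".join(video_ids),
              "key": api_key}
    return API_BASE + "?" + urlencode(params)

def view_counts(response):
    """Map video ID -> view count from a videos.list JSON response."""
    return {item["id"]: int(item["statistics"]["viewCount"])
            for item in response.get("items", [])}

# Parsing demo on a fragment shaped like the API's documented output
sample = {"items": [{"id": "abc123",
                     "statistics": {"viewCount": "4821"}}]}
print(view_counts(sample))  # prints {'abc123': 4821}
```

Either route (the scraping tool or the API) yields the same raw material for benchmarking: video IDs, durations, and view counts.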
24. Selecting Benchmark Data
WHO?
1. Peer Museums
similar budget, subject and scale of program
2. Aspirational Museum/Content Producer
25.
26.
27.
28.
29. Learnings
Good practices
• Frequency of posts
• Short duration
• There are exceptions to every rule
Easy wins if popularity is your goal
• Celebrities
30. Example
Goal: more views and followers
Production goals to date
• Capture program content
• Create better produced content (edited narratives)
New practices
• Post video clips (outtakes)
• Timely release of content
• Prototype informal content capture
32. Background
Two teams produce three streams of video
Initial strategies were developed in preparation for big opening moment
Now in a moment of reflecting back
Web and Digital Content:
Institutional Storytelling
Interpretive Media:
Artist Interviews
33. Artist Interviews
3-5 minute interviews with artists about their practice or an aspect of their
work
Goals
• Provide insightful content to current web/museum audiences
• Create a long-lasting public resource
• Depth of impact > breadth of reach
Priorities
• Artists in the collection
• Older artists are prioritized
• Relevance of production to on-site programming
• Strength of video material
34. Analytical Process
• Alternation between production and analysis phases
• Analytics findings
• Video was more popular than other types of content
• “Cinematic” videos were more viewed (compared to static talking-head interviews)
• Videos published to coincide with current events got more views
• Next steps
• Measure depth of impact using quantitative research
35. Institutional Storytelling
3-5 minute videos telling stories about artists and museum workers, working
on installations, projects, programs and performances
Goals
• Humanize the museum, artists and artworks
• Increase depth of interest in the museum and its offerings through storytelling
• Provide information about artists, artworks, exhibitions, and performances
• Attract visitors to the museum/promote specific activities
Priorities
• Alignment with institutional priorities/messaging
• Strength of story/visually compelling elements
• Budget
36. Analytical Approach
• Entering analytics phase after a period of production
• Initial findings
• Some platforms generate more views than others
• Location on website affects viewership
• Videos about iconic aspects of SFMOMA received more views
• Next Steps
• Strategize about publishing platforms
• Analyze website pathways to improve video placement on website
• Do surveys with niche audiences (like teachers) to determine impact
37. Different approaches for different goals
• View counts have different implications depending on video goals
• Qualitative analysis needed to measure “impact”
• For some videos views are less important than who watches and what they get out
of it
• Evaluate platform, not just content
39. Develop Your Own Video Production Evaluation Metrics
● Strategy of desired goal: What do you want to achieve with video?
○ Qualitative
○ Quantitative
● Start point to compare output after change in practice: Where are you at?
○ Reach
○ Engagement
○ Impact
● Some metrics gathered regularly: How are things changing?
○ Quantitative
○ Qualitative
● Reflect on your progress: What does this mean?
40. Next Steps/Goals
Outcomes from survey and evaluation
• Iterate on our data capture
• Create peer groups for benchmarking
• Continue building evaluation tools
MCN community survey version 2
• Additional questions based on the results of the first survey and
responses at the presentation
• The new survey: http://bit.ly/2ezBl5B
41. Best Practices
IDENTIFY GOALS: define what impact means for you
PROTOTYPE: create content and distribute in different ways, analyze results
FRAMEWORK: Select evaluation methods
BENCHMARK: Identify peers and aspirational content producers
ITERATE
42. Thank you
Anna Chiaretta Lavatelli, Director of Digital Media, Museum of Contemporary Art Chicago
(@annachiaretta)
Trilce Navarrete, Erasmus University Rotterdam (@trilcenavarrete)
Emily Robbins, Web and Digital Assistant, San Francisco Museum of Modern Art (@EmBRobbins)
Elena Villaespesa, Digital Analyst, The Metropolitan Museum of Art (@elenustika)
Editor's Notes
The museum community has been engaged in the production of moving image from the earliest days of documenting exhibitions, events and processes in celluloid film. The advancement of video technology over the past 50 years, especially the launch of YouTube 10 years ago, has resulted in exponential growth in moving image content production.
Creating and distributing video is a standard part of museum practice, but without many standards for producing and posting that content. Our methods have been limited by the capacity of teams or individuals to produce, and by our anecdotal understanding of what people like and how they like to view it. Fortunately, online distribution provides extensive information about what video is viewed and how, but this information is typically underutilized.
The Media Production and Branding SIG and the Data and Insights SIG have combined efforts to develop an online survey to gain an understanding of the scope and goals of video production in museums today. In this talk we will present an analysis of the data collected from the survey to gain an understanding of the state of video production in museums. We will discuss the learnings from the data collected and how museums can gain insights into the performance of their content.
We hope that this will provide meaningful approaches to analyzing video production methods and distribution tactics in one’s own institution, based on data evaluation, to support content strategy at the management level. This presentation is part of a series of benchmarking tools being developed in a collaborative effort between the two MCN Special Interest Groups. Participants will receive a preliminary step-by-step tool to collect and make sense of data from their online distribution platforms.
The goals of this panel: for the audience it will be good to take some specific ideas that can be quickly applied and deliver results AND to learn about the process experience in measuring video activities. Goals for us is to share/get feedback AND to have additional data in our survey.
Why video analysis? Video trends
We see an increase in the use of video. Museums are also adopting the technology for a number of specific uses (often different from those of the other main video contributors).
We still lack a harmonized method for measuring success. However: how do you measure success?
Metrics are numbers that try to capture the performance of institutions. Some measures (such as performance indicators) serve as evidence of progress, while others (performance measures) are defined depending on the need. For example, efficiency may mean small, low-resolution images and fewer resources to achieve greater quantity even at lower quality; or efficiency may mean larger formats and more resources to achieve quality even at lower quantity.
There is no such thing as ‘the performance’ of an institution; metrics make sense when compared to other institutions or to other times. The challenge is to ensure homogeneity in data collection and analysis so that comparisons are relevant.
Number of followers in social media, selected institutions from EU and the US
(data gathered in Sept-Oct 2016)
Use of Wikimedia video content to illustrate Wikipedia articles continues to grow
1 of the 26 institutions (3.8%) does not evaluate its video production at all
Nice to inventory current practice but we should do it for the public: what is the public doing? Who is the public? … can we get data on our MCN community for next time?
In order to evaluate the engagement with video content the museum should use a combination of quantitative and qualitative metrics to measure the reach (volume of people that view the content and characteristics of this traffic), engagement/attention (how long people spend watching the video) and impact (how the video content has been used to have an impact on the user). The table on this slide shows some example of the metrics and methods that could be used.
Reach. One of the main objectives of museums is reaching out to all audiences. As the survey results show, the majority of museums post their video content on social media and third-party platforms, sites where audiences already are. Therefore, it is important to understand the reach of this content with metrics like video views. We can target specific users on each of the platforms, so it is key to analyze the profile of this audience. How people get to this content (e.g. a video embedded on a blog, the museum’s Facebook page, the museum’s website...) will help the museum evaluate the success of its distribution and reach efforts on each channel, and also help it understand the potential impact of redistribution by users who share this video content on their own sites or social media accounts. The latter will give the museum insights into the types of videos most valued by users. The majority of platforms have an analytics tool that provides data about video views and other reach metrics; in some platforms, stats are more detailed than in others.
Engagement. Besides the number of plays, which can give us a sense of potential user reach, museums need to understand how users engage with the video content. Based on the results, half of the museums surveyed are using these metrics to evaluate their content. This set of metrics includes, for example, the % of the video watched and average view duration… but it can also measure engagement with the content related to the video, such as clicks on links or people consuming other content on that page. These metrics evaluate the effectiveness of video retention: how attractive the video is for the user. Again, many of the analytics tools offer very detailed information about the user experience of interacting with the video. For instance, YouTube Analytics offers a graph showing where users stopped watching a video, and Facebook Insights provides total video views as well as views of at least 10 or 30 seconds of the video.
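The retention metrics described above (average view duration, % of the video watched) reduce to simple arithmetic over per-view watch times. A minimal sketch, with invented numbers standing in for real analytics data:

```python
# Sketch of the engagement metrics above: average view duration and
# average % of the video watched, from a list of per-view watch times.
# The watch times and video length below are invented illustrations.

def engagement(watch_seconds, video_length):
    """Return (average view duration, average % of video watched)."""
    avg = sum(watch_seconds) / len(watch_seconds)
    return avg, 100.0 * avg / video_length

# Hypothetical watch times (in seconds) for a 180-second video
views = [180, 45, 90, 120, 15]
avg_duration, pct_watched = engagement(views, 180)
print(f"avg view duration: {avg_duration:.0f}s "
      f"({pct_watched:.0f}% of video watched)")
```

In practice platforms report these numbers directly, but computing them yourself makes clear why a high view count can coexist with weak retention.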
Impact. As the survey results show, museums produce video content around exhibitions to promote them, or record lectures and talks for researchers or learners. Ultimately the video aims to increase awareness of the museum, sell exhibition tickets, or increase users’ knowledge of the museum’s subject, among other objectives. Museums can use analytics tools to capture the impact of user advocacy, such as shares of the content, or use web analytics tools to track how many people came from these channels, from ads, or from watching videos on the website to purchase a ticket. To capture some of the impact metrics, the museum will need to implement surveys at the museum or use other methods to gather direct feedback from a sample of users.
This evaluation framework can serve as a starting point for museums to capture video metrics related to the main objectives of their video production activity.
What is good video? Using data to do better with our content
Media Production and Branding SIG and the Data and Insights SIG of MCN collaboration
The Met has published a series of 360 degrees videos on Facebook including shots of the Temple of Dendur, the Great Hall among other iconic spaces of the museum and The Met Breuer, the new location that opened in March this year.
http://www.metmuseum.org/blogs/digital-underground/2016/facebook-360-temple-of-dendur
The success of this series of videos has been great, not only in reaching audiences but also in the conversations users engaged in.
For these videos we looked at reach (views and people reached) but also at how many people helped increase the virality of these videos. The majority of the views came from people who do not follow The Met on Facebook; the impact of the shares really amplifies the reach of this video content.
Comments reached the thousands on each video, and for the museum this was one of the key success elements of this content. Comments varied for each video. For example, comments on the Temple of Dendur video included:
People who love this space and commented on the feelings they have when they visit this part of the museum
People in other countries who valued having this immersive experience on their phones or computers, to explore a museum they visited years ago or have never visited. This video allowed people to discover this space; some did not know this amazing art piece was at the museum.
People who love technology, who were very interested in the process of recording these videos, shared them with other “tech nerds,” and commented on how amazing they were
Anna’s mapping of peer channel data to try to understand methods for content production
Views v. Duration
Views v. Publish frequency
Who has a channel you’d like to be more like, even if just in one aspect (upload frequency, style, duration, etc.)?
Anna’s mapping of peer channel data to gain insights on achieving more views
Views v. Duration
Views v. Publish frequency
Averages
Trends
Likes over time
Frequency of publishing
Regularity shows continued commitment.
Short is better, from what we know, but length doesn’t stop people from viewing (Walker): great content is great content.
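The views-vs-duration mapping described in these notes can be sketched by bucketing a peer channel’s videos by length and averaging views per bucket. The channel data below is invented for illustration; real numbers would come from the scraped or API-fetched channel data.

```python
# Sketch of the views-vs-duration mapping: bucket peer videos by length
# and compare average views per bucket. The (duration, views) pairs
# below are invented; substitute real per-video channel data.
from collections import defaultdict

def views_by_duration(videos, bucket_seconds=120):
    """Average views per duration bucket (bucket index -> mean views)."""
    buckets = defaultdict(list)
    for duration, views in videos:
        buckets[duration // bucket_seconds].append(views)
    return {b: sum(v) / len(v) for b, v in sorted(buckets.items())}

# (duration in seconds, view count) for a hypothetical peer channel
peer = [(95, 20000), (110, 15000), (240, 4000), (400, 2500), (80, 30000)]
for bucket, avg in views_by_duration(peer).items():
    lo, hi = bucket * 120, (bucket + 1) * 120
    print(f"{lo}-{hi}s: {avg:,.0f} avg views")
```

The same grouping works for views vs. publish frequency: swap duration for days-since-last-upload and look for the trend, while remembering that there are exceptions to every rule.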
Production goals:
Capture and distribute as much program content as possible for our archive (and future scholarship)
Create better produced content to be a viewer’s trusted resource for good video (demonstrated by views and shares)
New practices:
Post video clips (outtakes) more regularly between productions
Prototype informal content capture for a sustainable increase in production
More upfront work to ensure timely release of content
Select evaluation methods and indicators: what do you want to measure, and what is the best metric to inform that?
Metrics are imperfect at capturing complex goals such as improving the engagement, education, or impact of museum work, but they can serve as indicators for benchmarking: against your own institution in the past, present and future, and against other institutions. Analyze performance and adjust the approach to production/distribution, and the evaluation method itself.