
From Russia with Love: Fox Sports World Cup Production (ARC333) - AWS re:Invent 2018


The 2018 FIFA World Cup was the biggest production in Fox Sports' 20-year history in terms of personnel, hours, and scale. With 64 matches broadcast, it totaled more than the four previous World Cups combined. To deliver this, Fox Sports deployed a joint solution from AWS, Aspera, and a portfolio of APN partners that enabled live delivery of broadcast video from Russia to Los Angeles for remote production. In this session, learn the architectural patterns of large-scale data and content movement to the cloud over long distances with low-latency requirements, the high-availability requirements around broadcast workloads, and the tradeoffs to consider. We cover how Amazon S3, Amazon EC2, AWS Lambda, Amazon SQS, and other services processed over 1.9 PB of original content over the 30-day tournament.


  1. © 2018, Amazon Web Services, Inc. or its affiliates. All rights reserved. From Russia with Love: Fox Sports World Cup Production ARC333. Brandon Potter, Director, Post Production, Fox Sports | Mike Flathers, CTO, IBM Aspera | Bhavik Vyas, Global Tech Partnerships, AWS M&E
  2. What we’re going to talk about (Los Angeles)
  3. Three aspects to this story: video transport and processing | AWS infrastructure | live event production
  4. #1 Fox Sports: video transport and processing | AWS infrastructure | live event production
  6. FIFA World Cup production overview (Moscow to Los Angeles): ~2 PB live-transcoded and written to Amazon S3 | <10 seconds delay from Moscow to LA for live edit | 1-3% packet loss over the WAN (Moscow-LA) | 64 matches in 12 venues spread across 1,800 miles | >150 TB total in a single day | 64 × 2160p50 UHD-HDR records
  7. Business challenges: Security (secure ingest) | Content management (22,000 assets) | Time (“We’re doing it live!”) | Multi-site production | Distance | Small footprint | Temp location
  8. Event production overview: Moscow → AWS PDX → Los Angeles. Host Broadcast Services SDI camera feeds and file-based content ingested in Moscow; 1080p AVC-I 220 Mbps to storage via Aspera FASPstream; HLS streams to web users; 1080p AVC-I 220 Mbps to desktop editors and EVS in Los Angeles
  9. Fox Sports Charlotte, “Little Russia”
  10. Russia ingest room
  11. Baseband feed monitoring
  12. Telestream scheduling
  13. Media operator stations
  14. Production Asset Management: Reach Engine
  15. #2 The AWS infrastructure and processing: video transport and processing | AWS infrastructure | live event production
  16. Recap of event stats • June 14 - July 15, 2018 • 64 games • 64 UHD-HDR full-game records • <10 seconds delay from Moscow to LA for live edit • ~2 PB and ~22K assets live-transcoded and written to Amazon S3
  17. Recap of business requirements • Receive 64 match feeds in Moscow • Record 64 match feeds in Los Angeles • Archive everything • Virus-scan everything • Support three “Edit @Home” workflows: live edit in <30s, live sub-clip, post-game edit
  18. Horizontal and vertical solution matrix. Video transport and processing (LA, Russia, and AWS): Transport | IBM Aspera; Transcode/Package | Telestream (Vantage Cloud). AWS: Amazon EC2, Amazon S3, AWS Direct Connect, AWS Lambda, Amazon SQS. Content production (LA on-site and Russia): Editing | Adobe Premiere; PAM | Levels Beyond; Storage | Harmonic MediaGrid, Quantel
  19. Let’s look at the WAN topology: Los Angeles, Moscow, Host Broadcast Services
  20. Let’s look at the end-to-end video path (Moscow → PDX → Los Angeles, Host Broadcast Services) • Live edit in <30s • Live sub-clip • Post-game edit
  21. WAN: Russia to US — Moscow, Frankfurt (FRA), Oregon (PDX), Los Angeles; 2x 10 Gbps redundant public IP; 1x 10 Gbps AWS backbone
  22. Host Broadcast Services, Frankfurt, LV POP, Oregon
  23. VPCs to support video processing and storage: Host Broadcast Services, PDX endpoints, Prod and Dev VPCs (Oregon)
  24. Security: AV scanning file-based content • All file-based content required an AV scan • Why not on premises? Limited-duration event (four weeks); limited on-site capacity; needed to “air gap” on-premises facilities • Solution: quarantine using Amazon S3 and AWS VPCs; automate AV workflows
  25. Processing workflows to AV scan 22K assets (Oregon, prod VPC): a write to S3 triggers an S3 notification onto the new-asset SQS queue; an Auto Scaling group of AV scanners, controlled by queue depth, reads each asset from S3, scans it, and sends results to Lambda; Lambda processes the results and updates a second SQS queue with passed assets; a listener on the AV-pass queue registers each asset as prod-ready
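The queue-driven scan pipeline on the slide can be sketched as a minimal local simulation. This is not the production code: the function and queue names are hypothetical stand-ins, and `queue.Queue` plays the role of the two SQS queues, with the S3 notification, the Auto Scaling scan workers, and the Lambda results step reduced to plain functions.

```python
import queue

# Hypothetical stand-ins for the two SQS queues on the slide:
# new assets awaiting a scan, and assets that passed the AV check.
new_asset_queue = queue.Queue()   # fed by S3 PUT event notifications
av_pass_queue = queue.Queue()     # fed by the results-processing step

def on_s3_put(object_key: str) -> None:
    """S3 notification handler: enqueue every newly written asset."""
    new_asset_queue.put(object_key)

def scan_worker(scan_fn) -> list:
    """Auto Scaling group worker: drain the new-asset queue, run the
    AV scan, and forward clean assets to the pass queue (the step the
    Lambda results processor performs in the real architecture)."""
    passed = []
    while not new_asset_queue.empty():
        key = new_asset_queue.get()
        if scan_fn(key):              # AV verdict: True means clean
            av_pass_queue.put(key)
            passed.append(key)
    return passed

def register_prod_assets() -> list:
    """PAM side: drain the pass queue and register assets as prod-ready."""
    ready = []
    while not av_pass_queue.empty():
        ready.append(av_pass_queue.get())
    return ready
```

In the real deployment each stage runs independently and scales on queue depth; the sketch only shows how the quarantine gate keeps unscanned assets out of the prod-ready set.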
  26. #3 Video transport and processing: video transport and processing | AWS infrastructure | live event production
  28. Recap of business challenges: Security (secure ingest) | Content management (22,000 assets) | Time and distance | “We’re doing live multi-site production!”
  29. Challenge - Time and distance. Obtainable throughput with TCP: throughput (bps) = TCP window size (bits) / latency (seconds). Consider 200 ms of latency, 2% packet loss, and a 64 KB TCP window: maximum throughput per session = 65535 * 8 / 0.2 = 2,621,400 bps = 2.62 Mbps. More distance means more packet loss and more latency. What can be done? • Minimize startup time • Minimize bandwidth overhead • Reduce the number of retransmissions
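The slide's back-of-the-envelope bound is just window size over round-trip time; a few lines make the arithmetic reproducible (the function name is my own):

```python
def tcp_max_throughput_bps(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on single-session TCP throughput: window / RTT."""
    return window_bytes * 8 / rtt_seconds

# The slide's example: a 64 KB window over 200 ms Moscow-to-LA latency.
bps = tcp_max_throughput_bps(65535, 0.2)
print(f"{bps:.0f} bps = {bps / 1e6:.2f} Mbps")  # 2621400 bps = 2.62 Mbps
```

This bound ignores packet loss entirely; with the 1-3% loss observed on the Moscow-LA path, real TCP throughput would be lower still, which is why a UDP-based protocol like Aspera FASP was used instead.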
  30. Time and distance optimization by the numbers • Optimized pipeline: direct to S3 (in memory), direct server-to-server transfer, direct API integration • WAN transfer speed helped: c3.8xlarge instance type, 2.1 Gb/s per stream, 3.5 Gb/s total (five sessions, one machine) • But more was needed: dynamic scale-out (three servers in Moscow), 9.5 Gb/s of sustained throughput
  31. Challenge - Security. Secure ingest • Session authentication: SSH 3DES • Transfer authorization: tokens, access key/secret pairs • Data encryption and integrity verification: AES (128, 192, 256); an MD5 hash applied to each datagram before transmission and verified on the receiver
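The per-datagram integrity check described on the slide can be sketched with the standard library alone. This is an illustration of the hash-and-verify pattern, not Aspera's implementation: the `seal`/`verify` names are mine, and the AES encryption layer that wraps the payload in the real protocol is omitted.

```python
import hashlib

DIGEST_LEN = 16  # MD5 produces a 16-byte digest

def seal(payload: bytes) -> bytes:
    """Sender side: prepend the payload's MD5 digest to the datagram."""
    return hashlib.md5(payload).digest() + payload

def verify(datagram: bytes) -> bytes:
    """Receiver side: recompute the digest and reject corrupted datagrams."""
    digest, payload = datagram[:DIGEST_LEN], datagram[DIGEST_LEN:]
    if hashlib.md5(payload).digest() != digest:
        raise ValueError("datagram integrity check failed")
    return payload
```

Note that a bare hash detects corruption in transit but not deliberate tampering; in the real system the datagram is also AES-encrypted, and the session itself is authenticated over SSH.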
  32. Challenge - Content management. 22,000 assets • API integration with Reach Engine • Server-to-server transfers • Multiple (three) Aspera servers in Moscow • Auto-scale cluster in AWS (us-west-2, Oregon) • SAN storage in LA for live full-res editing
  33. Key workflows - Growing records. Key design principles • Optimize, optimize, optimize • Provide in-application feedback • Expect the unexpected • Avoid anything bespoke
  34. Key workflows - Growing records. Live sub-clip workflow: 5 Mbps proxy, Moscow to S3 (us-west-2, Oregon)
  36. Lessons learned • Simulation is key • Collaborate: avoid walls between vendors • Teamwork and solutions • Track progress relentlessly
  37. Lessons learned • An S3 HLS origin over large geographical areas introduced latency in Russia; implement a CDN • Building the right team is paramount to success and overcoming challenges • Slack • …and hot sauce
  39. Looking forward and what’s next • Women’s World Cup • Explore virtualized workflows • Machine learning • Uncompressing FASP streams on the destination • Tighter API integrations
  40. Thank you! Brandon Potter, Director, Post Production, Fox Sports | Mike Flathers, CTO, IBM Aspera | Bhavik Vyas, Global Tech Partnerships, AWS M&E
