Netflix has developed a new global subtitles workflow to process timed text in more than 20 languages. The new workflow uses TTML2 as the canonical format and includes configurable inspections and conversions. Netflix is actively involved in standards work through the W3C, supports IMSC1 and TTML2, and is building open source tools for IMF and timed text to help standardization efforts.
Timed Text At Netflix
1. Timed Text at Netflix
Rohit Puri (rpuri@netflix.com)
Engineering Manager, Video Systems
Digital Supply Chain, Netflix
February 19, 2016
2. The Netflix Content Processing System
• Video systems team develops cloud-scalable systems and tools
– Audio/Video: ingestion/inspections and packaging/DRM
• e.g., IMF, QuickTime, and MP4-DASH
– Timed Text: entire processing pipeline
• e.g., W3C TTML, W3C WebVTT
[Pipeline diagram: source from content partner → Ingestion and Inspections → Trans-coding → Packaging and DRM → downloadable to CDN]
3. Acknowledgements
• Dae Kim (dakim@netflix.com)
• Shinjan Tiwary (stiwary@netflix.com)
• Harold Sutherland (hsutherland@netflix.com)
• David Ronca (dronca@netflix.com)
• Glenn Adams (Skynav Inc.) - member, W3C Timed Text Working Group (TTWG)
4. Netflix after January 6, 2016
• > 75 million subscribers
• ~ 190 countries
• > 12,000,000,000 hours streamed in Q4 2015
• 20+ languages
5. Talk Outline
• History and Legacy Workflow
• New Workflow
• Standards Activity and Roadmap
6. Talk Outline
• History and Legacy Workflow
• New Workflow
• Standards Activity and Roadmap
7. A Brief History of Timed Text
• Latin Alphabet (2014 and before)
– First subtitles delivered in 2009: bottom-centered, yellow, with italics and underline options (dfxp-simplesdh)
– Follow-up 2012 offering: multiple colors, background, outline, generic font family, font size, position information (dfxp-ls-sdh) (ls => ‘less simple’)
– First WebVTT output profile created in 2013
• Global Subtitles (2015+)
– TTML2 (Timed Text Markup Language) based Japanese subtitles in 2015
– Bidirectional text with worldwide launch in January 2016
8. Legacy Workflow (2014 and before)
• Two-step procedure
– source inspection
– source conversion to output
• Sources
– CEA-608 based Scenarist Closed Captions (.scc)
– EBU Subtitling data exchange format (.stl)
– SubRip (.srt)
– Timed Text Markup Language 1 (.ttml, .dfxp)
• Outputs
– feature-restricted TTML1 outputs
– feature-restricted WebVTT outputs
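The conversion half of this two-step procedure can be illustrated with a minimal sketch (not the production converter): going from SubRip to WebVTT means adding the `WEBVTT` header, dropping SRT's numeric cue indices, and switching the millisecond separator from `,` to `.`.

```python
import re

def srt_to_webvtt(srt_text: str) -> str:
    """Convert SubRip (.srt) text to a minimal WebVTT document.

    Illustrative sketch only: a real converter also validates timing,
    maps styling tags, and normalizes character encodings.
    """
    out = ["WEBVTT", ""]
    for block in srt_text.strip().split("\n\n"):
        lines = block.splitlines()
        if lines and lines[0].strip().isdigit():
            lines = lines[1:]  # drop the SRT cue index
        # SRT uses ',' before milliseconds; WebVTT uses '.'
        lines = [re.sub(r"(\d{2}:\d{2}:\d{2}),(\d{3})", r"\1.\2", ln)
                 for ln in lines]
        out.extend(lines + [""])
    return "\n".join(out)
```

For example, `srt_to_webvtt("1\n00:00:01,000 --> 00:00:02,500\nHello")` yields a `WEBVTT` header followed by the cue with dot-separated milliseconds.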
11. Talk Outline
• History and Legacy Workflow
• New Workflow
• Standards Activity and Roadmap
12. Japanese Subtitles
• Essential Features
– vertical text
– ruby annotations
– horizontal-in-vertical
– Unicode
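These features map directly onto TTML2 constructs. As a rough illustration (region name and layout values are hypothetical, and the fragment is abbreviated from a full document), vertical writing, ruby, and horizontal-in-vertical might look like:

```xml
<!-- Abbreviated, hypothetical TTML2 fragment (not a complete document):
     vertical writing via tts:writingMode, ruby annotation via tts:ruby,
     horizontal-in-vertical (tate-chu-yoko) via tts:textCombine -->
<region xml:id="vert" tts:writingMode="tbrl" tts:extent="15% 80%" tts:origin="80% 10%"/>

<p region="vert">
  <span tts:ruby="container">
    <span tts:ruby="base">東京</span>
    <span tts:ruby="text" tts:rubyPosition="before">とうきょう</span>
  </span>
  で<span tts:textCombine="all">12</span>月に
</p>
```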
13. Japanese Subtitles Challenges
• New source format - Videotron Lambda (.cap)
– inaccessible specification
– ambiguity in format
– non-interoperable vendor implementations
• TTML1 does not support essential Japanese features; TTML2 needed
• Netflix device SDK did not support essential rendering features
– delivery of image-based subtitles
– significant investment in open source software (OSS) tools
14. Japanese Subtitles Conversion Workflow (1-off)
• .cap sources converted to TTML2, and then to image subtitles or WebVTT (the WebVTT specification appears incomplete w.r.t. Japanese)
• Image subtitles archive contains .png images + manifest with timing and positioning information
• Green modules available as OSS (https://github.com/skynav/ttt)
[Workflow diagram: .cap → cap2tt → TTML2 → TTX → WebVTT writer; TTPE → TTML2 ISD → .png images → Archiver]
15. New Workflow: Inspections
14
TTML2
based
canonical
model
Semantic
inspections
STL parser
and syntax
inspections
SCC parser
and syntax
inspections
CAP parser
and syntax
inspections
TTML1,
TTML2,
IMSC1
SRT parser
and syntax
inspections
TTML family
parser and
syntax
inspections
model
serialized
to disk
Source
format and
charset
detection
input
file
February 19, 2016
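The first stage of the inspection pipeline is format and charset detection. A minimal sketch of how such a dispatcher could work (the class name and heuristics are illustrative assumptions, not Netflix's implementation):

```java
// Hypothetical sketch of the detection step in the inspection workflow:
// sniff the first bytes of the input, then dispatch to a per-format
// parser. Class name and heuristics are illustrative, not Netflix's code.
import java.nio.charset.StandardCharsets;

public class SourceDetector {

    public enum SourceFormat { SCC, STL, SRT, CAP, TTML_FAMILY, UNKNOWN }

    /** Guess the source format from the first bytes of the file. */
    public static SourceFormat detect(byte[] head) {
        String text = new String(head, StandardCharsets.UTF_8).trim();
        if (text.startsWith("Scenarist_SCC")) {
            return SourceFormat.SCC;          // .scc files begin "Scenarist_SCC V1.0"
        }
        if (text.startsWith("<")) {
            return SourceFormat.TTML_FAMILY;  // XML prolog or root element
        }
        if (text.length() >= 6 && text.startsWith("STL", 3)) {
            return SourceFormat.STL;          // EBU GSI block: code page + "STLnn.nn"
        }
        if (text.matches("(?s)^\\d+\\R.*-->.*")) {
            return SourceFormat.SRT;          // cue counter line, then "start --> end"
        }
        return SourceFormat.UNKNOWN;          // e.g. Lambda .cap needs charset-aware sniffing
    }
}
```

Real detection also has to handle byte-order marks and non-UTF-8 charsets (Lambda .cap is typically Shift_JIS), which is why the diagram pairs format detection with charset detection.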
16. New Workflow: Conversions
• Configurable filter-based architecture
• Bank of model-domain filters
• Output writer generates text or images
[Workflow diagram: deserialized model → model-domain filter chain (Filter1 … FilterN) → output generator (output-specific filters → output writer) → output file]
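The filter-based architecture above can be sketched as a chain of model-domain transforms applied in configured order before the output writer runs. All names below are illustrative assumptions, not the actual Netflix classes:

```java
// Minimal sketch of a configurable, filter-based conversion stage:
// each filter takes the canonical model and returns a transformed model;
// the chain order comes from configuration. Names are hypothetical.
import java.util.List;
import java.util.function.UnaryOperator;

public class ConversionPipeline {

    /** Stand-in for the TTML2-based canonical model. */
    public record Model(List<String> cues) {}

    /** A model-domain filter transforms the model into a new model. */
    public interface ModelFilter extends UnaryOperator<Model> {}

    private final List<ModelFilter> chain;

    public ConversionPipeline(List<ModelFilter> chain) {
        this.chain = chain;
    }

    /** Run the deserialized model through every filter in order. */
    public Model run(Model model) {
        for (ModelFilter f : chain) {
            model = f.apply(model);
        }
        return model;
    }
}
```

Because each filter works purely in the model domain, output-specific concerns (text vs. image generation) stay isolated in the writer, and new output profiles only need a new filter configuration.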
17. Global Subtitles
• i18n-grade timed text processing workflow
• All languages of the world (+ Klingon)
• New TTML2-based output profile (“nflx-ttml-gsdh”)
– vertical text
– horizontal-in-vertical
– ruby annotations
– bidirectional text
• Both text and image delivery options
18. Talk Outline
• History and Legacy Workflow
• New Workflow
• Standards Activity and Roadmap
19. Our Experience with Source Formats
• SCC
– limited to Latin alphabet
– non-standard use of SMPTE timecode
• EBU-STL
– dated¹, ambiguous, non-interoperable industry practices
– no support for Asian character sets
• SRT
– Too simple - positioning information not used in practice
• LambdaCAP
– ambiguous format - hard to find official specification
• TTML1
– not self-contained - needs “Document Processing Context”
– no support for Japanese rendering features
¹ "The medium for exchange is a 3.5-inch high-density portable magnetic disk (microfloppy). The disk is formatted for 1.44 Mbytes (2 sides, 80 tracks, 18 sectors/track)."
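The SCC timecode problem is concrete: .scc files carry 29.97 fps SMPTE timecodes, and misreading drop-frame as non-drop drifts by roughly 3.6 seconds per hour. The standard drop-frame correction (this is the textbook SMPTE arithmetic, not Netflix's code) is:

```java
// Standard SMPTE 29.97 fps drop-frame conversion: two frame numbers are
// skipped at the start of every minute, except every tenth minute.
public class DropFrame {

    /** Frame count from media start for a drop-frame HH:MM:SS;FF timecode. */
    public static long toFrameNumber(int h, int m, int s, int f) {
        long totalMinutes = 60L * h + m;
        long framesNominal = (3600L * h + 60L * m + s) * 30L + f;
        long dropped = 2L * (totalMinutes - totalMinutes / 10);
        return framesNominal - dropped;
    }

    /** Media time in milliseconds at the true rate of 30000/1001 fps. */
    public static long toMillis(int h, int m, int s, int f) {
        return Math.round(toFrameNumber(h, m, s, f) * 1001.0 / 30.0);
    }
}
```

For example, one hour of drop-frame timecode (01:00:00;00) is 107892 frames, not 108000; an inspection stage that ignores the drop-frame flag produces visibly desynchronized subtitles.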
20. TTWG Standards
• IMSC1 (Internet Media Subtitles and Captions) is a TTML1-based W3C Candidate Recommendation; mandatory for IMF
– multiple Netflix-sponsored implementations were announced to the TTWG on February 1, 2016
– Netflix plans 100% support for IMF
– Netflix ingest implementation will support IMSC-T (text profile)
• We have multiple TTML2 implementations in development and will support the TTML2-based IMSC2
• Netflix is enthusiastic about HTMLCue
21. Open Source Activity
• regxmllib (https://github.com/sandflow/regxmllib)
– sponsored by Netflix
– tools that provide essential building blocks for authoring of IMF CPL
• ttt (https://github.com/skynav/ttt)
– sponsored by Netflix
– tools for validation and rendering of TTML1/2
• photon (https://github.com/Netflix/photon) (December 2015)
– developed at Netflix
– complete set of tools for validation of IMF packages
22. Talk Summary
• Subtitles experience core to Netflix business
• Netflix committed to TTWG standards
– multiple IMSC1 and TTML2 implementations declared or in flight
• Netflix plans to be 100% IMF
• Netflix is actively involved in OSS efforts around timed text as well as IMF
23. 2015 Tech Emmy for Netflix
• Netflix was a co-recipient of the 2015 Technology and Engineering Emmy Award for "Standardization and Pioneering Development of Non-Live Broadcast Captioning"