Media and Entertainment Network Exchange Concept
Abstract— The Media and Entertainment industry is in the midst of rapid evolution as movie studios and broadcast television productions move away from traditional analog formats to digital. Rapid advancements in display resolution and file formats such as 4K, Ultra High Definition, and High Frame Rate filming are placing additional stress on IT systems and increasing the demand for storage and network bandwidth. While many media organizations are already leveraging Internet technologies in their day-to-day operations, this paper will present concepts to improve the media creation process, creative collaboration, efficiency, and security via traditional network technologies, as well as newer advancements in cloud computing and software defined networking.
I. INTRODUCTION
In December 2013, Paramount Studios announced that “Anchorman 2” would be the last movie Paramount would release for traditional 35mm film distribution. This was promptly followed by the release of “Wolf of Wall Street” in digital format only. Other major studios such as 20th Century Fox and Disney have made similar announcements concerning all-digital distribution models for future releases.1
While this was a historic moment for the media industry, digital distribution is only one avenue fueling the increasing need for network bandwidth and storage. As the creative process moves from 2K to 4K, and eventually 8K, the media industry will outpace the ability of traditional storage mediums to deliver content in a timely manner, and traditional CAPEX models will become economically untenable. A 2013 Coughlin Associates report surveyed SMPTE and HPA members on current and future storage needs for digital productions, highlighting this explosive growth in storage.2
• Several petabytes of storage may be required for a complete stereoscopic digital movie project at 4K resolution, and there is some production work as high as 8K.
• Within 10 years we could see close to an exabyte of content created in a single major movie project.
• Storage in remote “clouds” is playing an increasing role in enabling collaborative workflows.
• Between 2013 and 2018 we expect about a 5.8X increase in the required digital storage capacity used in the entertainment industry.
• Active archiving will drive increased use of HDD storage for “archiving” applications, supplementing tape for long-term archives.
Film Length    2K        4K        8K
1 minute       14 GB     28 GB     71 GB
5 minutes      71 GB     143 GB    358 GB
60 minutes     859 GB    1.7 TB    4.3 TB
Required storage as resolution increases – 12-bit color, 24 fps, RGB 4:4:4
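Raw footage size follows directly from frame geometry, bit depth, and frame rate, so the table can be sanity-checked. The sketch below assumes a DCI 2K container of 2048×1080 (the paper does not state the exact geometry used); it reproduces the 2K row, while the 4K and 8K rows imply different frame geometries, so only the 2K figure is checked here.

```python
def raw_gb_per_minute(width, height, bits_per_sample=12, channels=3, fps=24):
    """Decimal gigabytes of raw (uncompressed) RGB 4:4:4 footage per minute."""
    bytes_per_frame = width * height * channels * bits_per_sample / 8
    return bytes_per_frame * fps * 60 / 1e9

# DCI 2K container (2048x1080) at 12-bit RGB 4:4:4, 24 fps
print(round(raw_gb_per_minute(2048, 1080)))  # 14, matching the table's 2K row
```

Note that raw size scales linearly with pixel count, so a true 4× pixel jump (2K to full 4K) quadruples the per-minute figure; the table's gentler growth suggests narrower containers at the higher resolutions.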
Creative collaboration is a key component across theatrical and television productions, often involving multiple geographical locations, artists, and vendors. Regional and international tax credits continue to geographically disperse filming locations, making working with these large data sets in a timely manner extremely difficult. This often results in manual intervention and the movement of physical assets, e.g., disk and tape.
II. MEDIA NETWORK EXCHANGE & GLOBAL MEDIA ON-RAMPS
Internet Service Providers (ISPs) and network carriers have been leveraging regional network peering points to exchange traffic since the early 1990s. This allowed the Internet to scale rapidly and enabled providers to move web content across networks owned by other carriers or ISPs. As these regional peering points grew in size, they began to give the Internet a physical location due to the aggregation of fiber optic cabling, service providers, and network traffic. Today, Equinix is North America’s largest network peering provider, with key locations in Ashburn, Silicon Valley, Los Angeles, Chicago, Dallas, and New York. Equinix also maintains a substantial global footprint, with peering locations across Europe and Asia.
Jason Banks – 2/11/2014
Cloud service providers are following a deployment pattern similar to the early network exchange points, with the major Internet Exchange Points (IXPs) becoming the crossroads of cloud services, network providers, content producers, and content consumers. A July 2014 Gigaom article highlighted these trends in location-aware cloud services, which aim to address client concerns over user latency, data protection, and redundancy.3
These same network and cloud intersection points are becoming increasingly important to the entertainment industry as productions chase tax breaks in various locations and seek to leverage cloud services to reduce cost. Overlaying the current map of production film credits, we discover that many key shooting locations are within 1,000 kilometers of a major IXP, as well as the larger public cloud providers such as AWS, Google, MS Azure, and Rackspace.
Tax breaks in relation to major Equinix IXPs
Media and entertainment can apply the same principles used in traditional network peering relationships to establish an ecosystem that facilitates the creative process and reduces data movements. Many of these regional locations already have substantial infrastructure in place to facilitate the uploading of data to vital co-location and network exchange points. Additionally, Software Defined Networking (SDN) and Network Functions Virtualization (NFV) are reshaping how metro Ethernet and WAN services are delivered, with many providers developing methods to deliver metered, on-demand, elastic network services similar to the current cloud model.4 Pairing these advancements with a content-peered ecosystem of content producers and vendors will offer flexible network options bound by less restrictive network services contracts at reduced cost, as opposed to traditional year or multi-year network services contracts in shoot locations that are only used for a few weeks a year.
Location               Studios                                         Exchange
L.A. / U.S. West       6 Majors                                        L.A., Silicon Valley
New York / U.S. East   Silver Cup, Steiner, Kaufman, Broadway Stages   Ashburn, New York, Chicago
U.S. South             Pinewood, Raleigh Studios, Millennium           Ashburn, Dallas
Canada                 Toronto and Montreal                            New York, Ashburn, Chicago
Major North America Studios in relation to closest Equinix IXP
Advantages of a Network and Entertainment Exchange
• Optimal locations to take advantage of cloud services
• Optimal locations for remote work over PCoIP
• Large ecosystem of services including network providers, CDNs, cloud vendors, and social networks
• Ability to bypass carriers and peer directly to the global Internet, as opposed to purchasing point-to-point circuits
• Leverage the IXP footprint for bulk data transfer between regional locations using the Internet
• Leverage cloud vendor networks to move content across locations and regions
• Potential to use secure federated storage to ease data movements between co-located partners and vendors
• As SDN evolves, the underlying transport will become more data aware and networks more content aware
• Carrier-neutral data centers and Internet Exchange Points (flexibility for studios and their vendors to garner services suited to business requirements)
III. REDUCING DATA MOVEMENTS & LEVERAGING CLOUD SERVICES
Growing data sets will require compute and storage to be co-located and accessible via low-latency network connectivity over the WAN. Every 1,000 km of distance adds approximately 5 ms of network latency. Transferring data to and from cloud services adds further overhead as traffic passes through additional network devices to reach cloud storage.
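The rule of thumb above reduces to a one-line estimate. The helper below is a sketch: the ~5 ms per 1,000 km figure comes from the text, while the Los Angeles–Vancouver distance in the example is an illustrative approximation, not a figure from the paper.

```python
def propagation_ms(distance_km, ms_per_1000_km=5.0):
    """One-way propagation latency estimate for a fiber path of the given length."""
    return distance_km / 1000.0 * ms_per_1000_km

# e.g. a roughly 1,700 km fiber path (on the order of Los Angeles to Vancouver)
print(propagation_ms(1700))  # 8.5 ms one-way
```

Real paths add router and middlebox delay on top of propagation, so measured WAN latency will exceed this floor.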
Teradici developed the PC-over-IP (PCoIP) protocol in 2008; the protocol provides lossless compression of video and sound over the WAN and has become one of the primary tools for remote desktops and artist workstations. Network latency has a direct impact on PCoIP’s ability to deliver seamless playback of audio, video, and pixel-level artistry. Research by video card manufacturers has shown the ideal level of WAN latency is between 0 and 30 ms.5 Remote workstations for visual artists will become the norm as companies seek to maintain the control and security of theatrical IP and reduce the movement of large data sets. Industrial Light and Magic, for example, is already successfully applying these solutions between Los Angeles, Silicon Valley, and Vancouver, Canada.6 The ability to work remotely provides several advantages:
• Reduces the need to move large data sets across the WAN
• Improves security by locking data to the content owner’s data center and storage pools
• Saves cost by using zero/thin clients as opposed to purchasing high-powered desktops for each artist
Network Latency (ms)   Approximate Distance (km)   User Experience
0 - 30                 0 - 1500                    Perception free
40 - 60                1500 - 2500                 Minimal latency
60 - 100               2500 - 5000                 Sluggish mouse; poor audio
> 100                  > 5000                      Visual slowness; audio dropout
Latency effects on PCoIP traffic
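The bands above map naturally onto a lookup helper. This is a sketch assuming the gaps between the published bands (e.g. 30-40 ms) fall into the next tier up; the labels are taken verbatim from the table.

```python
def pcoip_experience(latency_ms):
    """Classify expected PCoIP user experience from measured WAN latency (ms)."""
    if latency_ms <= 30:
        return "Perception free"
    if latency_ms <= 60:
        return "Minimal latency"
    if latency_ms <= 100:
        return "Sluggish mouse; poor audio"
    return "Visual slowness; audio dropout"

print(pcoip_experience(25))   # Perception free
print(pcoip_experience(120))  # Visual slowness; audio dropout
```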
Remote artist workstations are only one potential use case. As cloud adoption accelerates, cloud bursting for rendering, applications for dailies review, VFX collaboration, and archiving will all require similar WAN performance. If we build a WAN latency map around tax-break locations and cloud provider data centers, the same intersection points correspond with the previously identified regional Internet Exchange Points.
Courtesy of Equinix – AWS latency in relation to Equinix IXP
By combining best-in-breed vendors, cloud services, federated storage, and software defined networking, new IT architectures can deliver traditional services more efficiently and in new ways.
Cloud provider backbones can be utilized to transport data between different cloud regions as well as business locations. Current products such as Front Porch Digital’s DIVA and LYNX Cloud can automatically cache and replicate data across facilities. Co-location and federated private cloud storage, fronted by digital asset management systems like 5th Kind, can be used to secure and share limited datasets with content service providers, providing audit trails and allowing content owners to maintain control over critical IP.
By leveraging best-in-class datacenter providers and regional Internet Exchange Points to deliver optimal performance and tie the ecosystem together, we can produce architectures that combine public and private cloud services, facilitate remote work, and improve security.
Example architecture combining public and private clouds
IV. CONCLUSION
Production content is growing at extreme rates across the media industry. While this poses potential issues for the industry in terms of cost, efficiency, and security, many technologies already exist to mitigate or minimize the ramifications. Applying these advancements in new and creative ways will be a critical component for industry success. The content ecosystem based around a media exchange presented in this paper is one possible solution. It seeks to address current and future challenges not only through technology but also by understanding the economics and production complexities shaping the business environment.
REFERENCES
1. Verrier, R. (2014, January 18). Paramount stops releasing major movies on film. Los Angeles Times. Retrieved October 1, 2014, from http://articles.latimes.com/2014/jan/18/entertainment/la-et-ct-paramount-end-to-film-20140118
2. Coughlin, T. (2013, July 28). Professional Media and Entertainment Drives Storage Growth. Forbes. http://www.forbes.com/sites/tomcoughlin/2013/07/28/professional-media-and-entertainment-drives-storage-growth/
3. The next big front for cloud competition: Location, location, location. (n.d.). Gigaom. Retrieved October 1, 2014, from https://gigaom.com/2014/07/26/the-next-big-front-for-cloud-competition-location-location-location/
4. The Metro Ethernet Forum Tackles the NaaS Challenge. SDxCentral. https://www.sdncentral.com/news/metro-ethernet-forum-tackles-naas-challenge/2014/09/
5. EVGA PCoIP User Guide. http://www.evga.com/support/manuals/files/PCoIP_User_Guide.pdf
6. Teradici ILM case study. http://www.teradici.com/docs/default-source/resources/case-studies/cs_industrial-light-amp-magic-final-7-14-14.pdf