Scott Janus will present on the value and technical challenges of high dynamic range (HDR) and wide color gamut (WCG) displays. HDR allows for content with a wider range of brightness and more colors, but requires new displays. It provides a more realistic visual experience than standard dynamic range. Implementing HDR faces technical hurdles around representing and processing the increased brightness and color levels.
High-Dynamic Range (HDR) Demystified
3. 3
Presentation Goals
Explain the value of High Dynamic Range (HDR) and Wide Color Gamut (WCG)
Explain the technical challenges of deploying HDR + WCG
4. 4
What is HDR?
Practically speaking:
Content with a wider range of brightness and color
– Also an increased number of brightness and color levels
Experiencing HDR content requires new displays
5. 5
Let’s talk about color and brightness…
Really they are tightly interwoven, but let’s consider them individually for now
8. 8
Monitors Can Produce a Finite Range of Colors
Typically, red, green, and blue subpixels.
If you modulate each primary 0-100%, you can envision a three-dimensional color cube
All colors in this cube are the gamut of colors the device can reproduce
9. 9
Let’s simplify things…
Humans are bad at perceiving volumetric data, so compress the color volume to two dimensions
Chromaticity Diagram
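The flattening the slide describes can be sketched in a few lines: CIE 1931 (x, y) chromaticity coordinates are just a normalization of XYZ tristimulus values. This is the standard formula; the D65 white point values used in the demo are the usual published ones.

```python
def chromaticity(X, Y, Z):
    """Project XYZ tristimulus values onto the 2-D CIE 1931 (x, y) plane."""
    s = X + Y + Z
    return X / s, Y / s

# D65 white point (standard published XYZ values, normalized to Y = 1):
x, y = chromaticity(0.95047, 1.00000, 1.08883)
print(f"{x:.4f} {y:.4f}")  # -> 0.3127 0.3290, the familiar D65 chromaticity
```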
10. 10
Wide Color
• Most PCs and HDTVs use the 709 gamut
• The majority of next-generation content uses the Bt.2020 gamut
• 2015 UHDTVs cover ~85% of Bt.2020
11. 11
[Figure: Bt.2020 content displayed unmodified on a 709 display vs. Bt.2020 content gamut-mapped for a 709 display]
We must perform gamut mapping
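As an illustration of the crudest form of gamut mapping, here is a minimal sketch: convert linear-light Bt.2020 RGB to Bt.709 with a 3x3 matrix, then hard-clip anything the narrower gamut cannot represent. The matrix values are the commonly published Bt.2020-to-Bt.709 primaries conversion (cf. ITU-R BT.2087); production gamut mapping uses gentler roll-offs than a hard clip, which is shown here only to make the out-of-gamut problem visible.

```python
# Commonly published linear-light Bt.2020 -> Bt.709 conversion matrix.
M = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def bt2020_to_bt709(r, g, b):
    """Convert linear-light Bt.2020 RGB to Bt.709, hard-clipping to [0, 1]."""
    out = [sum(M[i][j] * c for j, c in enumerate((r, g, b))) for i in range(3)]
    return tuple(min(max(c, 0.0), 1.0) for c in out)

# A fully saturated Bt.2020 red lands outside Bt.709 and gets clipped:
print(bt2020_to_bt709(1.0, 0.0, 0.0))  # -> (1.0, 0.0, 0.0), detail lost to clipping
```

Note that white (1, 1, 1) maps to almost exactly white, since the matrix rows each sum to ~1; only saturated colors fall outside the smaller gamut.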
12. 12
Advantages of Wide Color
We can reproduce lifelike colors we
couldn’t before and create more
immersive experiences
14. What is HDR (in this context)
Practically speaking:
Content with a wider range of brightness and color
Experiencing HDR content requires new monitors
– Legacy content: 100 nit peak brightness, 709 gamut
– HDR content: 10,000 nit peak brightness, 2020 gamut
HDR is an ambiguous term
There are several industry specs defining various flavors of HDR
15. 15
(new) HDR: what it is not
“HDR Photography”
2005-era HDR rendering/gaming
Such as the Half-Life 2: Lost Coast demo
Both of these techniques generate images designed to be shown on an SDR monitor
17. Terminology
Luminance
Quantitative measurement of amount of light passing through an area
Can be unequivocally measured by instruments
Linearly proportional to # of photons
Brightness
Subjective human perception of luminance
Varies wildly based on ambient conditions and from human to human
Non-linearly proportional to # of photons
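The non-linear luminance-to-brightness relationship the slide describes is often modeled by CIE 1976 lightness (L*), which grows roughly as the cube root of luminance. A small sketch using that standard formula (including its usual linear segment near black):

```python
def cie_lightness(Y, Yn=1.0):
    """CIE 1976 L*: perceived lightness (0-100) from relative luminance Y/Yn."""
    t = Y / Yn
    if t > (6 / 29) ** 3:
        return 116 * t ** (1 / 3) - 16
    return (29 / 3) ** 3 * t  # linear segment near black

# Doubling the photons does not double perceived lightness:
print(round(cie_lightness(0.18), 1))  # mid-grey -> 49.5
print(round(cie_lightness(0.36), 1))  # twice the light -> 66.5, not 99
```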
24. 24
TL;DR
Standard Dynamic Range:
De facto brightness range of current content and displays
High Dynamic Range:
Substantially brighter next-generation content and displays
25. 25
Standard Dynamic Range
For the past 80 years, video has been graded to appear properly on a 100 nit display
However, no adjustments are made to account for displays of differing luminance
– 30 nit laptops in low-power mode
– 600 nit HDTVs in Vivid mode
Similarly, no adjustments are made to the content when you surf the web and play games, switching from phones to tablets to PCs
This is wrong, but it has been good enough
26. 26
Disambiguation: HDR Photography
Multiple exposures from different times, combined off-line to create a single SDR image which has different exposure levels for different areas of the picture
Not what we’re talking about today
27. 27
HDR Video
A single exposure captures a wide range of luminance at a single instant in time
Intended to be displayed on an HDR display
28. 28
Things to ponder
With color, many real-world hues can be exactly reproduced on screens
Almost every real-world situation has objects at >100 nits
So the real world is HDR, and every SDR picture you take involves an HDR->SDR conversion
A 100-nit object on the screen almost never corresponds to a 100-nit object in the real world
29. 29
Multiple ranges of HDR
[Figure: comparison of the luminance ranges of Capture, Container, Movie, and Display, with marks at 0, 10,000, and 15,000 nits]
30. 30
HDR Conversions
[Figure: pipeline from Capture to Movie to display — Mastering is performed by studio creatives; HDR->HDR conversion is handled by the player or by the UHDTV for output to an HDR Display or an SDR Display]
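The deck does not name a specific operator for the HDR->SDR leg of this pipeline, so as an illustrative stand-in here is the classic global Reinhard tone-mapping curve, with an assumed 100 nit SDR reference white (both the operator choice and the 100 nit figure are assumptions for the sketch):

```python
def reinhard_tonemap(nits, sdr_white=100.0):
    """Compress an HDR luminance (in nits) into the 0-100 nit SDR range.

    Simple global Reinhard curve; illustrative only, not the operator
    any particular HDR standard mandates.
    """
    l = nits / sdr_white          # luminance relative to SDR reference white
    return sdr_white * l / (1.0 + l)

# Highlights are compressed smoothly; output never reaches 100 nits:
for nits in (10, 100, 1000, 10000):
    print(nits, "->", round(reinhard_tonemap(nits), 1))
```

Reference white (100 nits) maps to exactly 50 nits here; dynamic operators adjust curves like this per scene or per frame, which is the static vs. dynamic tone mapping distinction.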
35. Linear vs non-linear light
In the real world, luminance is determined by the number of photons
However, brightness (the human perception of luminance) is non-linearly proportional to the number of photons
36. 36
Code Words
[Figure: code words 0-7 plotted against luminance on a linear scale, annotated with the just noticeable difference]
Directly storing luminance is inefficient
To prevent banding using SDR, you would need to use 13-14 bits to code a contemporary 100-nit signal
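A back-of-the-envelope way to see the inefficiency, under loudly assumed numbers (a 1% Weber-fraction JND and a 0.05 nit black level — change these and the bit counts move): compare linear coding, whose step size must match the JND at the darkest level, with perceptually uniform coding, where each code word is one JND apart.

```python
import math

# Illustrative assumptions: 1% Weber-fraction JND, 0.05 nit black level.
JND = 0.01
L_MIN, L_MAX = 0.05, 100.0

# Linear coding: the step size must equal the JND at the *darkest* level,
# so most code words are wasted where steps fall far below threshold.
linear_bits = math.ceil(math.log2(L_MAX / (JND * L_MIN)))

# Perceptually uniform (log-like) coding: each code word is one JND apart.
uniform_bits = math.ceil(math.log2(math.log(L_MAX / L_MIN) / math.log(1 + JND)))

print(linear_bits, uniform_bits)  # -> 18 10: a gap of several bits
```

The absolute numbers depend entirely on the assumed JND model and black level; the slide's 13-14 bit figure reflects different assumptions. The point both make is the same: linear coding needs several more bits than perceptual coding to avoid banding.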
38. 38
Problems
Although gamma is a reasonable approximation of human perception in the SDR (0-100 nit) range…
Gamma is not a good match for human perception in the 0-10,000 nit HDR range
If you use gamma for a 10,000 nit signal, you need to use two extra bits/sample to eliminate banding
39. 39
SMPTE 2084: HDR Electro-Optical Transfer Function (EOTF)
[Figure: luminance (0-1) vs. perceptually uniform video signal (0-1), comparing the gamma curve to the PQ EOTF]
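The PQ curve on this slide is defined in SMPTE ST 2084; its EOTF can be implemented directly from the constants published in the standard:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the standard.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """Map a normalized PQ code value (0-1) to display luminance in nits."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(pq_eotf(0.0))         # -> 0.0 nits
print(pq_eotf(1.0))         # -> 10000.0 nits
print(round(pq_eotf(0.5)))  # -> 92 nits
```

Note how code value 0.5 decodes to only ~92 nits: half the code range is spent on the darkest ~1% of the luminance range, which is exactly the perceptual uniformity that gamma lacks over 0-10,000 nits.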
40. Blending
On computers, we usually blend colors in non-linear space:
Blended color = 0.5*(A+B)
This is fast and easy, but wrong, because:
Brightness A really encodes Luminance A' = A^2.2
so 0.5*(A + B) is not the average of the underlying luminances
Correct answer: average the luminances, then re-encode:
Blended color = (0.5*(A' + B'))^(1/2.2) = (0.5*(A^2.2 + B^2.2))^(1/2.2)
43. Scaling
The same math applies not just to blending, but spatial scaling operations as well
Scaling involves blending pixels together
Scaling in non-linear space will generate errors
44. 44
Advantages of HDR
HDR creates much more lifelike experiences
• “Like looking out a window”
• “Like you’re really there”
HDR done right is clearly distinguishable from existing content and displays
46. 46
Summary
HDR + WCG give us powerful new tools to create visual experiences whose improvement is clearly visible to the average viewer
You can watch HDR+WCG movies right now
Making this all work requires many changes to the production pipeline and to the products used to display it
47. 47
Call to Action
Make more HDR content and build more systems capable of playing HDR
Make sure you have a really compelling HDR experience before calling it HDR
Upcoming Intel products have cool HDR features…