Learn how the Boat Attack demo was created using the Universal Render Pipeline. These slides offer an in-depth look at the features used in the demo, including Shader Graph, Custom Render Passes, Camera Callback, and more.
Speaker:
Andre McGrail - Unity Technologies
Watch the session on YouTube: https://youtu.be/ZPQdm1T7aRs
Taking Killzone Shadow Fall Image Quality Into The Next Generation - Guerrilla
This talk focuses on the technical side of Killzone Shadow Fall, the platform exclusive launch title for PlayStation 4.
We present the details of several new techniques that were developed in the quest for next generation image quality, and the talk uses key locations from the game as examples. We discuss interesting aspects of the new content pipeline, next-gen lighting engine, usage of indirect lighting and various shadow rendering optimizations. We also describe the details of volumetric lighting, the real-time reflections system, and the new anti-aliasing solution, and include some details about the image-quality driven streaming system. A common, very important, theme of the talk is the temporal coherency and how it was utilized to reduce aliasing, and improve the rendering quality and image stability above the baseline 1080p resolution seen in other games.
Secrets of CryENGINE 3 Graphics Technology - Tiago Sousa
In this talk, the authors give an overview of the deferred lighting approach used in CryENGINE 3, along with an in-depth description of the many techniques involved. Original file and videos at http://crytek.com/cryengine/presentations
The presentation describes the physically based lighting pipeline of Killzone: Shadow Fall, a PlayStation 4 launch title. The talk covers the studio's transition to a new asset creation pipeline based on physical properties. It also describes the light rendering systems used in a new 3D engine built from the ground up for the upcoming PlayStation 4 hardware. A novel real-time lighting model, simulating physically accurate area lights, is introduced, as well as a hybrid ray-traced/image-based reflection system.
We believe that physically based rendering is a viable way to improve the efficiency and quality of the asset creation pipeline. It also enables rendering quality to reach a new level, while remaining highly flexible to art direction requirements.
Filmic Tonemapping for Real-time Rendering - Siggraph 2010 Color Course - hpduiker
Filmic Tonemapping for Real-time Rendering, a presentation from the Siggraph 2010 Course on Color, on a technique developed from film that became very applicable to games with the addition of support for HDR lighting and rendering in graphics cards.
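The kind of curve this course discusses can be sketched numerically. The constants below are John Hable's widely circulated filmic operator, a close sibling of the technique in these slides rather than necessarily the exact parameters presented there:

```python
def hable_partial(x):
    # Shoulder/toe constants from John Hable's widely published filmic curve.
    A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
    return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

def filmic_tonemap(hdr, exposure_bias=2.0, white_point=11.2):
    # Map an HDR luminance value into [0, 1], normalized so that
    # `white_point` maps exactly to 1.0 instead of hard-clipping.
    white_scale = 1.0 / hable_partial(white_point)
    return hable_partial(hdr * exposure_bias) * white_scale

# Bright values roll off smoothly toward 1.0 rather than clipping:
assert abs(filmic_tonemap(5.6) - 1.0) < 1e-9   # 5.6 * bias hits the white point
assert filmic_tonemap(0.18) < filmic_tonemap(1.0) < filmic_tonemap(5.6)
```

The filmic shape keeps contrast in the toe (shadows) and compresses the shoulder (highlights), which is what makes it preferable to a plain Reinhard curve for HDR game rendering.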
Past, Present and Future Challenges of Global Illumination in Games - Colin Barré-Brisebois
Global illumination (GI) has been an ongoing quest in games. The perpetual tug-of-war between visual quality and performance often forces developers to take the latest and greatest from academia and tailor it to push the boundaries of what has been realized in a game product. Many elements need to align for success, including image quality, performance, scalability, interactivity, ease of use, as well as game-specific and production challenges.
First we will paint a picture of the current state of global illumination in games, addressing how the state of the union compares to the latest and greatest research. We will then explore various GI challenges that game teams face from the art, engineering, pipelines and production perspective. The games industry lacks an ideal solution, so the goal here is to raise awareness by being transparent about the real problems in the field. Finally, we will talk about the future. This will be a call to arms, with the objective of uniting game developers and researchers on the same quest to evolve global illumination in games from being mostly static, or sometimes perceptually real-time, to fully real-time.
Talk by Fabien Christin from DICE at GDC 2016.
Designing a big city that players can explore by day and by night, while improving on the unique visuals of the first Mirror's Edge game, isn't an easy task.
In this talk, the tools and technology used to render Mirror's Edge: Catalyst will be discussed. From the physical sky to the reflection tech, the speakers will show how they tamed the new Frostbite 3 PBR engine to deliver realistic images with stylized visuals.
They will talk about the artistic and technical challenges they faced and how they tried to overcome them, from the simple light settings and Enlighten workflow to character shading and color grading.
Takeaway
Attendees will gain insight into the technical and artistic techniques used to create a dynamic time-of-day system with updating radiosity and reflections.
Intended Audience
This session is targeted at game artists, technical artists, and graphics programmers who want to know more about Mirror's Edge: Catalyst rendering technology, lighting tools, and shading tricks.
A technical deep dive into the DX11 rendering in Battlefield 3, the first title to use the new Frostbite 2 Engine. Topics covered include DX11 optimization techniques, efficient deferred shading, high-quality rendering and resource streaming for creating large and highly-detailed dynamic environments on modern PCs.
Checkerboard Rendering in Dark Souls: Remastered by QLOC
This is a talk on checkerboard rendering that Markus & Andreas held at Digital Dragons 2019.
In it they quickly go through the history of Checkerboard Rendering before taking a deep dive into how it works and how it is implemented in Dark Souls: Remastered. Lastly, they present the quality and performance improvements they got from using it and their conclusion.
PS: The PDF file includes useful in-depth notes from both authors.
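The core idea behind checkerboard rendering can be sketched on the CPU: each frame shades only one parity of a checkerboard pattern and fills the other half from the previous frame. This is a toy reconstruction with illustrative names; a real implementation reprojects the old pixels with motion vectors and rejects mismatches:

```python
import numpy as np

def checkerboard_mask(h, w, phase):
    # Pixels whose (x + y) parity matches `phase` are rendered this frame;
    # the other half is filled from the previous frame.
    yy, xx = np.mgrid[0:h, 0:w]
    return ((xx + yy) % 2) == phase

def checkerboard_combine(prev_full, new_half, phase):
    # Naive reconstruction: take the freshly rendered half, reuse the
    # previous frame for the rest.
    mask = checkerboard_mask(*new_half.shape, phase)
    return np.where(mask, new_half, prev_full)

# Two half-cost frames reconstruct the full image for a static scene:
scene = np.arange(16.0).reshape(4, 4)
frame0 = checkerboard_combine(np.zeros((4, 4)), scene, phase=0)
frame1 = checkerboard_combine(frame0, scene, phase=1)
assert np.array_equal(frame1, scene)
```

Each frame shades only half the pixels, which is where the performance win in the talk comes from; the quality work is all in handling motion between the two phases.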
Colin Barre-Brisebois - GDC 2011 - Approximating Translucency for a Fast, Che...
Presentation from Game Developers Conference 2011 (GDC2011), presented by Colin Barre-Brisebois and Marc Bouchard. A rendering technique for real-time translucency rendering, implemented in DICE's Frostbite 2 engine, featured in Battlefield 3.
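The approximation from the talk boils down to a few vector operations: distort the inverted light vector by the surface normal, dot it against the view vector, and attenuate by a precomputed local-thickness value. A sketch with illustrative parameter defaults (the published shader also multiplies in light attenuation and color):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def saturate(x):
    return max(0.0, min(1.0, x))

def translucency(light, normal, eye, thickness,
                 distortion=0.2, power=3.0, scale=2.0, ambient=0.0):
    """Cheap back-lighting term, after Barre-Brisebois & Bouchard.

    Conventions: `light` points from surface to light, `eye` from surface
    to camera. `thickness` is a precomputed local-thickness value in
    [0, 1] (1 = thin and transmissive, 0 = opaque). Defaults here are
    illustrative, not taken from the slides.
    """
    # Distort the light vector by the normal, then measure how directly
    # the viewer looks into the (inverted) transmitted light direction.
    lt_light = [l + n * distortion for l, n in zip(light, normal)]
    lt_dot = saturate(dot(eye, [-c for c in lt_light])) ** power * scale
    return (lt_dot + ambient) * thickness

# A light behind a thin surface leaks through; a front light does not:
back_lit  = translucency((0, 0, -1), (0, 0, 1), (0, 0, 1), thickness=1.0)
front_lit = translucency((0, 0, 1),  (0, 0, 1), (0, 0, 1), thickness=1.0)
assert back_lit > 0.0 and front_lit == 0.0
```

The thickness term is what makes this cheap: it is baked offline (e.g. from inverted ambient occlusion), so no per-pixel ray marching is needed at runtime.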
Physically Based Lighting in Unreal Engine 4 - Lukas Lang
Talk held at Unreal Meetup Munich on 15th May 2019.
I talked about some of the theoretical background of physically based lighting and demonstrated a workflow, including the value tables needed to use it easily.
A Certain Slant of Light - Past, Present and Future Challenges of Global Illu... - Electronic Arts / DICE
Global illumination (GI) has been an ongoing quest in games. The perpetual tug-of-war between visual quality and performance often forces developers to take the latest and greatest from academia and tailor it to push the boundaries of what has been realized in a game product. Many elements need to align for success, including image quality, performance, scalability, interactivity, ease of use, as well as game-specific and production challenges.
First we will paint a picture of the current state of global illumination in games, addressing how the state of the union compares to the latest and greatest research. We will then explore various GI challenges that game teams face from the art, engineering, pipelines and production perspective. The games industry lacks an ideal solution, so the goal here is to raise awareness by being transparent about the real problems in the field. Finally, we will talk about the future. This will be a call to arms, with the objective of uniting game developers and researchers on the same quest to evolve global illumination in games from being mostly static, or sometimes perceptually real-time, to fully real-time.
This presentation was given at SIGGRAPH 2017 by Colin Barré-Brisebois (EA SEED) as part of the Open Problems in Real-Time Rendering course.
Screen Space Decals in Warhammer 40,000: Space Marine - Pope Kim
My Siggraph 2012 presentation slides on Screen Space Decals in Warhammer 40,000: Space Marine.
SSD is similar to deferred decals, so I focused more on the problems we had and how we solved (or avoided) them.
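The projection step at the heart of both SSD and deferred decals: reconstruct each pixel's world position from the depth buffer, transform it into the decal's unit-box space, and reject pixels outside the box. A toy version of that step (names illustrative; the world-position reconstruction itself is omitted):

```python
import numpy as np

def decal_uv(world_pos, world_to_decal):
    """Project a depth-reconstructed world position into a decal's
    unit box. Returns a texture coordinate in [0, 1]^2, or None if
    the pixel lies outside the decal volume (the clip step)."""
    local = world_to_decal @ np.append(world_pos, 1.0)
    if np.any(np.abs(local[:3]) > 0.5):
        return None
    return local[:2] + 0.5   # box space [-0.5, 0.5] -> uv [0, 1]

# Identity decal at the origin: a nearby point maps into the decal,
# a distant one is clipped.
M = np.eye(4)
assert decal_uv(np.array([0.2, -0.1, 0.0]), M) is not None
assert decal_uv(np.array([2.0, 0.0, 0.0]), M) is None
```

Because the decal is rendered as a simple box and everything else comes from the depth buffer, it wraps around arbitrary scene geometry without any per-mesh work.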
There are a lot of articles about games. Most of these are about particular aspects of a game, like rendering or physics. All engines, however, have a binding structure that ties all aspects of the game together. Usually there is a base class (Object, Actor, and Entity are common names) that all objects in the game derive from, but very little is written on the subject. Only very recently have a couple of talks on game tech briefly touched on it. Still, choosing a structure to build your game on is very important. The end user might not “see” the difference between a good and a bad structure, but this choice will affect many aspects of the development process. A good structure will reduce risk and increase the efficiency of the team.
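The base-class structure described above can be sketched minimally. All names here are hypothetical, not from any particular engine; the point is the uniform update/reap loop that the binding structure gives the engine:

```python
class Entity:
    """Base class every game object derives from; the engine drives
    all objects through this one interface."""
    def __init__(self, name):
        self.name = name
        self.alive = True

    def update(self, dt):
        pass  # overridden by derived objects

class Projectile(Entity):
    def __init__(self, name, speed):
        super().__init__(name)
        self.position = 0.0
        self.speed = speed

    def update(self, dt):
        self.position += self.speed * dt
        if self.position > 100.0:
            self.alive = False  # out of range; the engine will reap us

class World:
    def __init__(self):
        self.entities = []

    def tick(self, dt):
        for e in self.entities:
            e.update(dt)
        # Central reaping: one place where dead objects leave the game.
        self.entities = [e for e in self.entities if e.alive]

world = World()
world.entities.append(Projectile("bolt", speed=300.0))
world.tick(dt=0.5)          # the bolt travels 150 units, past its range
assert world.entities == []
```

The choice of what lives in the base class (transform? lifetime? messaging?) is exactly the structural decision the paragraph argues is under-discussed.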
Optimizing the Graphics Pipeline with Compute, GDC 2016 - Graham Wihlidal
With further advancement in the current console cycle, new tricks are being learned to squeeze the maximum performance out of the hardware. This talk will present how the compute power of the console and PC GPUs can be used to improve the triangle throughput beyond the limits of the fixed function hardware. The discussed method shows a way to perform efficient "just-in-time" optimization of geometry, and opens the way for per-primitive filtering kernels and procedural geometry processing.
Takeaway:
Attendees will learn how to preprocess geometry on-the-fly per frame to improve rendering performance and efficiency.
Intended Audience:
This presentation targets seasoned graphics developers. Experience with DirectX 12 and GCN is recommended, but not required.
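The "just-in-time" geometry optimization this talk describes amounts to filtering out triangles the rasterizer would reject anyway, before the fixed-function hardware sees them. A CPU sketch of the idea (the talk runs this in a compute shader, and its real filters also include frustum and small-primitive tests):

```python
import numpy as np

def filter_triangles(positions, indices, view_dir):
    """Drop back-facing and degenerate triangles, emitting a compacted
    index buffer for the fixed-function hardware."""
    tris = indices.reshape(-1, 3)
    a, b, c = (positions[tris[:, i]] for i in range(3))
    normals = np.cross(b - a, c - a)
    area2 = np.linalg.norm(normals, axis=1)      # twice the triangle area
    facing = normals @ view_dir < 0.0            # normal faces the viewer
    keep = facing & (area2 > 1e-8)               # also drop zero-area tris
    return tris[keep].reshape(-1)

# A front-facing and a back-facing winding of the same triangle;
# only the front-facing one survives:
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0]], dtype=float)
view = np.array([0, 0, -1.0])
assert len(filter_triangles(verts, np.array([0, 1, 2]), view)) == 3
assert len(filter_triangles(verts, np.array([0, 2, 1]), view)) == 0
```

Doing this in compute pays off because culled triangles never occupy the hardware's primitive pipeline, which is the fixed-function bottleneck the talk is working around.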
Bill explains some of the ways that the Vertex Shader can be used to improve performance by taking a fast path through the Vertex Shader rather than generating vertices with other parts of the pipeline in this AMD technology presentation from the 2014 Game Developers Conference in San Francisco March 17-21. Check out more technical presentations at http://developer.amd.com/resources/documentation-articles/conference-presentations/
The goal of this session is to demonstrate techniques that improve GPU scalability when rendering complex scenes. This is achieved through a modular design that separates the scene graph representation from the rendering backend. We will explain how the modules in this pipeline are designed and give insights into implementation details, which leverage the GPU's compute capabilities for scene graph processing. Our modules cover topics such as shader generation for improved parameter management, synchronizing updates between the scene graph and the rendering backend, as well as efficient data structures inside the renderer.
Video here: http://on-demand.gputechconf.com/gtc/2013/video/S3032-Advanced-Scenegraph-Rendering-Pipeline.mp4
Talk by Yuriy O’Donnell at GDC 2017.
This talk describes how Frostbite handles rendering architecture challenges that come with having to support a wide variety of games on a single engine. Yuriy describes their new rendering abstraction design, which is based on a graph of all render passes and resources. This approach allows implementation of rendering features in a decoupled and modular way, while still maintaining efficiency.
A graph of all rendering operations for the entire frame is a useful abstraction. The industry can move away from “immediate mode” DX11 style APIs to a higher level system that allows simpler code and efficient GPU utilization. Attendees will learn how it worked out for Frostbite.
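A toy version of the pass/resource graph idea: passes declare what they read and write, the graph culls passes whose outputs nobody consumes, and the survivors run in submission order. This is a sketch of the general technique, not Frostbite's actual FrameGraph API:

```python
class RenderGraph:
    def __init__(self):
        self.passes = []  # declared in submission order

    def add_pass(self, name, reads, writes, fn):
        self.passes.append((name, set(reads), set(writes), fn))

    def compile(self, targets):
        # Walk backwards from the requested outputs, keeping only passes
        # whose writes are actually consumed; everything else is culled.
        needed, kept = set(targets), []
        for p in reversed(self.passes):
            name, reads, writes, fn = p
            if writes & needed:
                needed |= reads
                kept.append(p)
        return list(reversed(kept))

    def execute(self, targets):
        executed = []
        for name, _, _, fn in self.compile(targets):
            fn()
            executed.append(name)
        return executed

g = RenderGraph()
g.add_pass("gbuffer",  reads=[],       writes=["gbuf"],  fn=lambda: None)
g.add_pass("debug",    reads=["gbuf"], writes=["dbg"],   fn=lambda: None)
g.add_pass("lighting", reads=["gbuf"], writes=["hdr"],   fn=lambda: None)
g.add_pass("tonemap",  reads=["hdr"],  writes=["frame"], fn=lambda: None)
assert g.execute(targets=["frame"]) == ["gbuffer", "lighting", "tonemap"]
```

Because the full frame is declared up front, the graph can also derive resource lifetimes for transient-memory aliasing and barrier placement, which is where most of the efficiency the talk mentions comes from.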
Distributed high-quality image manipulation and review in a virtual collabora... - ETCenter
Taking advantage of centralized processing and storage, new dispersed workflows are now possible. Colorfront's cloud initiative enables virtual world wide collaboration for high end motion picture and television production.
The Visual Effect Graph gives artists of all experience levels the power to create amazing particle VFX. In this intermediate-level session, Julien Fryer and Vlad Neykov from our development team will give you a sneak peek into how to generate millions of GPU-based particles in real time using the Visual Effect Graph's toolset.
Vlad Neykov - Unity Technologies
Julien Fryer - Unity Technologies
Learn how to use the Lightweight Render Pipeline to optimize for maximum performance on mobile platforms. After watching this, you will be ready to build an experience for standalone mobile VR headsets, like the Oculus Go, and to use 3-DOF controllers for interactions.
Speakers:
Dan Miller (Unity Technologies)
For many years it has been possible to access graphical applications via remote desktop software. In recent years, cloud computing has become more prominent and is now a crucial computing paradigm.
Android has captured a large market share. The challenge addressed in this talk is to efficiently export Android graphics so as to support standard Android apps remotely.
More information can be found at: http://www.ascender.com/remote-graphics
A presentation I gave at the OpenStack Summit in Paris (November 2014) showing the remote rendering platform built in the XLCloud project. The main topic of the presentation is optimizing the video encoding by analysing the images and user attention.
Introduction to Software Defined Visualization (SDVis) - Intel® Software
Software defined visualization (SDVis) is an open-source initiative from Intel and industry collaborators. It improves the visual fidelity, performance, and efficiency of prominent visualization solutions, supporting the rapidly growing use of big data on workstations and high-performance computing (HPC) supercomputing clusters, without the memory limitations and cost of GPU-based solutions.
Upcoming rendering technology including scriptable render pipelines, advanced lighting options and more.
Presenter: Arisa Scott (Graphics Product Manager, Unity Technologies)
- Vector- and Raster-based Graphics
-- Idea behind Vector- and Raster-based Graphics
-- Crispness
-- Overview of Raster-based Drawing APIs
- Platform independent Graphics and GUIs in the Web Browser
-- Bare HTML Pages
-- Plugins and Problems
-- From rich Content to HTML 5
- Drawing with HTML 5 Canvas
-- Continuous, Event driven and free Drawing
-- Basic Drawing "How does Drawing work with JavaScript?"
-- Interaction with Controls
Similar presentations to How the Universal Render Pipeline unlocks games for you - Unite Copenhagen 2019:
Using synthetic data for computer vision model training - Unity Technologies
During this webinar Unity’s computer vision team provides an overview of computer vision, walks through current real-world data workflows, and explains why companies are moving toward synthetically generated data as an alternate data source for model training.
Watch the webinar: https://resources.unity.com/ai-ml/cv-webinar-dec-2021
The Tipping Point: How Virtual Experiences Are Transforming Global Industries - Unity Technologies
When it comes to emerging technology, Forrester found that “94% of those who have implemented real-time 3D are expanding their investment.”
Wonder why? Learn more in this webinar featuring guest speaker Paul Miller, a principal analyst at Forrester. He covers the key findings of a commissioned study conducted by Forrester Consulting on behalf of Unity, published in March 2020.
Learn more: https://on.unity.com/2Yz49kg
Watch the webinar: https://on.unity.com/3aYGlsF
Take a peek at the slide from the second installment of our 2020 roadmap: Live Games.
Watch the presentation on YouTube: https://www.youtube.com/watch?v=w6sn8bJiZ2g
Got questions about the roadmap? Check out the Q&A over on the Unity forum: https://on.unity.com/2wV3SwD
Take a peek at the slide from the first installment of our 2020 roadmap: Core Engine & Creator Tools.
Watch the presentation (hosted by Will Goldstone, Product Manager) on YouTube: https://www.youtube.com/watch?v=dDjsS4NPqFU
Got questions about the roadmap? Check out the Q&A over on the Unity forum: https://on.unity.com/CreateRoadmapQA
How ABB shapes the future of industry with Microsoft HoloLens and Unity - Uni... - Unity Technologies
It's high time for augmented reality to be brought to a wider audience. At ABB, we know that it is not just a gimmick anymore. However, with every innovative technology come new challenges. In these slides, we show how to overcome them and deliver valuable products with HoloLens and Unity.
Speakers:
Maciej Włodarczyk - ABB
Rafał Kielar - ABB
Watch the session on YouTube: https://youtu.be/QFsj8Pi_3Ho
Unity XR platform has a new architecture – Unite Copenhagen 2019 - Unity Technologies
Unity developed a new architecture that improves the support for existing and future augmented reality (AR) and virtual reality (VR) platforms. Learn about the technology under the hood, the consequent benefits and improvements to the platform, and how it impacts your workflows in creating AR/VR experiences.
Speakers: Mike Durand, Matt Fuad - Unity
Watch the session on YouTube: https://youtu.be/Stqk1GxlSK0
Autodesk and Unity announced a collaboration last year to streamline workflows and enable seamless development across the AEC design, build and operate lifecycles. This fall, Unity Reflect launches, giving designers, architects, and engineers the ability to seamlessly federate their Revit models for real-time 3D.
Andrew Sullivan, Digital Delivery Manager at SHoP Architects, will provide an overview of how they are using the product to enable real-time decision-making, reduce the time between revisions and meetings, and ultimately improve design review and construction planning processes.
Recording available here: https://youtu.be/qe0yxHA0fHI
How Daimler uses mobile mixed realities for training and sales - Unite Copenh... - Unity Technologies
Daimler Protics implemented mixed and augmented reality on mobile devices and used the Microsoft HoloLens for automotive production, training, and marketing. Discover the challenges Daimler Protics faced and the Unity solutions that eased the mixed reality implementation.
Speakers:
Daniel Keßelheim - Daimler Protics
Sebastian Rigling - Daimler Protics
Session available here: https://youtu.be/fTc1c8iTGqU
How Volvo embraced real-time 3D and shook up the auto industry - Unite Copenha... - Unity Technologies
Hear from Volvo's lead Unity developer, Timmy Ghiurau, about how he broke new ground by bringing technology forged in gaming into one of the leading brands in the automotive industry. Timmy will share how he used his gaming background to inspire people across his large organization to adopt Unity and embrace real-time 3D as a way of working.
Timmy Ghiurau - Volvo
Session available here: https://youtu.be/CD4Go3Uv5Uc
QA your code: The new Unity Test Framework – Unite Copenhagen 2019 - Unity Technologies
Are you involved in testing or QA on projects in Unity? In these slides, you'll get an overview of the state of Unity for all things testing-related, and have the opportunity to share your stories of success, failure, pain, and glory. Learn from your fellow developers and give feedback on how Unity could help you hold your projects to a higher standard of quality. You will also get an introduction to the newest features in the Test Framework.
Speakers:
Christian Warnecke - Unity
Richard Fine - Unity
Watch the session on YouTube: https://youtu.be/wTiF2D0_vKA
Engineering.com webinar: Real-time 3D and digital twins: The power of a virtu... - Unity Technologies
From buildings and infrastructure to industrial machinery and factories, digital twins are becoming integral across the industrial sector. In this webinar, first shown on Engineering.com, leaders from Unity and Unit040, provider of digital twin platform Prespective, share how digital twins add value at all stages of the project and product lifecycle, from the early stages of design to predictive maintenance using IoT data.
Watch the webinar here: create.unity3d.com/real-time-3d-and-digital-twins
Supplying scalable VR training applications with Innoactive - Unite Copenhage... - Unity Technologies
Major automotive brands like Volkswagen are leveraging the power of virtual reality to create immersive training programs that can be delivered across multiple global locations at the same time. Learn how to scale the production and distribution of real-time VR training in enterprise.
Speakers:
Thomas Wimmer - Innoactive
Andreea Raducan - Innoactive
Watch the session on YouTube: https://youtu.be/5DNFUTfyOEc
XR and real-time 3D in automotive digital marketing strategies | Visionaries ...
Unity Technologies
Augmented reality (AR), virtual reality (VR), and mixed reality (MR) – collectively known as XR – are making inroads in the automotive industry. Join this session led by Visionaries 777, which works with major auto brands like INFINITI, to learn about the range of immersive experiences you can build with Unity to create a better customer experience that results in more engagement and sales.
Speakers:
David Castañeda - Visionaries 777
Frantz Lasorne - Visionaries 777
Session available here: https://youtu.be/WJpeWHGXyms
Real-time CG animation in Unity: unpacking the Sherman project - Unite Copenh...
Unity Technologies
Get a complete walkthrough of the end-to-end animation workflow of the Sherman project. Learn how to use Unity for creating CG animation and take a deep dive into the real-time fur system in Unity.
Speaker:
Mike Wuetherick - Unity
Watch the session on YouTube: https://youtu.be/fFfWxErJMkY
Creating next-gen VR and MR experiences using Varjo VR-1 and XR-1 - Unite Cop...
Unity Technologies
The developers of Varjo VR-1 learned a lot about human eye resolution and the demands it puts on virtual reality (VR) content. In these slides, you'll explore what next-generation VR can mean for your VR experiences. Learn about what matters the most when it comes to visual quality, the possible caveats, and the role performance requirements play in this equation.
Speaker:
Mikko Strandborg - Varjo
What's ahead for film and animation with Unity 2020 - Unite Copenhagen 2019
Unity Technologies
Unity is enabling film and animation studios to revolutionize their pipelines with features developed specifically to empower storytellers who are creating linear and interactive content. Learn more about features such as Python, Shotgun, the Arbitrary Output Variables (AOV) used in Recorder for export, Alembic, and Universal Scene Description (USD).
Speaker:
Mathieu Muller - Unity
Watch the session on YouTube: https://youtu.be/wrc3R-BoDGs
How to Improve Visual Rendering Quality in VR - Unite Copenhagen 2019
Unity Technologies
Virtual reality (VR) is a new way to deliver thrilling and engaging content, and it allows for a deep level of immersion. Despite this, VR, especially on mobile, currently has several limitations, which can make it an unrealistic, unconvincing and, sometimes, uncomfortable experience. To achieve the true potential of VR, these limitations must be either solved or mitigated. Mitigations include optimal alpha compositing approaches, texture filtering techniques, and bump mapping methods for use with VR content. In these slides, technology company Arm outlines how to improve the rendering quality of your VR content, describing the most common pitfalls and bad practices before providing clear examples and solutions for overcoming them.
Speaker:
Ryan O'Shea - ARM
UiPath Test Automation using UiPath Test Suite series, part 3
DianaGray10
Welcome to part 3 of the UiPath Test Automation using UiPath Test Suite series. In this session, we cover desktop automation along with UI automation.
Topics covered:
- UI automation introduction
- UI automation sample
- Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
PHP Frameworks: I want to break free (IPC Berlin 2024)
Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
GraphRAG is All You need? LLM & Knowledge Graph
Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
DevOps and Testing slides at DASA Connect
Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We closed with a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Accelerate your Kubernetes clusters with Varnish Caching
Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf
91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
JMeter webinar - integration with InfluxDB and Grafana
RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Essentials of Automations: Optimizing FME Workflows with Parameters
Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Tobias Schneck
As AI technology pushes into IT, I found myself wondering, as an "infrastructure container Kubernetes guy", how this fancy AI technology gets managed from an infrastructure operations point of view. Is it possible to apply our lovely cloud-native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply them to our own infrastructure from an enterprise perspective. I want to give an overview of the infrastructure requirements and technologies that could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already gotten working for real.
Connector Corner: Automate dynamic content and events by pushing a button
DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But if the "Reject" button is pushed, colleagues will be alerted via a Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
How the Universal Render Pipeline unlocks games for you - Unite Copenhagen 2019
2. How the Universal Render Pipeline unlocks games for you
Felipe Lira, Graphics Programmer
André McGrail, Technical Artist
3. Unity Rendering Design
● Scriptable Render Pipeline
○ C# rendering API layer
○ Open Source: https://github.com/Unity-Technologies/ScriptableRenderPipeline
(Diagram: Unity exposes the SRP layer)
4. Unity Rendering Design
● Scriptable Render Pipeline
○ C# rendering layer API
○ Developers can create custom rendering solutions
(Diagram: Unity with the Built-in pipeline, and Custom RPs built on SRP)
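To ground the "custom rendering solution" point, here is a minimal sketch of a pipeline built directly on SRP, assuming Unity's `RenderPipeline` and `RenderPipelineAsset` base classes; `MinimalPipeline` and `MinimalPipelineAsset` are illustrative names, not part of any shipped pipeline:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: the smallest useful custom render pipeline on top of SRP.
// It only clears each camera's target, to show the entry points involved.
public class MinimalPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            context.SetupCameraProperties(camera);   // view/projection matrices
            var cmd = new CommandBuffer();
            cmd.ClearRenderTarget(true, true, Color.black);
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();
            context.Submit();                        // flush the queued GPU work
        }
    }
}

// Asset Unity instantiates the pipeline from (assigned in Graphics settings).
[CreateAssetMenu(menuName = "Rendering/MinimalPipelineAsset")]
public class MinimalPipelineAsset : RenderPipelineAsset
{
    protected override RenderPipeline CreatePipeline() => new MinimalPipeline();
}
```

Assigning the asset under Project Settings → Graphics swaps the whole renderer, which is exactly the flexibility (and the burden) the next slide describes.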
5. Unity Rendering Design
● Scriptable Render Pipeline
○ Built-in pipeline has several limitations
○ Developing a custom pipeline requires deep graphics knowledge
○ How to fill the gap?
(Diagram: a gap between the Built-in pipeline and Custom RPs built on SRP)
6. Unity Rendering Design
● We provide two new render pipeline solutions:
○ High Definition Render Pipeline (HDRP)
○ Universal Render Pipeline (URP)
(Diagram: HDRP and URP, built on SRP, fill the gap between Built-in and Custom RPs)
7. What is Universal Render Pipeline (URP)?
● Beautiful Graphics
● Performance
● Platform Reach
● Extensibility
8. What is Universal Render Pipeline (URP)?
● Continuation of the Lightweight Render Pipeline
● Shaders, scripts and assemblies are updated automatically
● Manual upgrade required if you are using UsePass or Shader.Find
● Universal Render Pipeline will be supported in 2019.4 LTS
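The Shader.Find caveat exists because URP shaders live under new names that the automatic upgrader cannot rewrite inside string lookups. A small illustrative sketch of the manual fix ("Universal Render Pipeline/Lit" is URP's replacement for the Built-in Standard shader; the MonoBehaviour wrapper is just for demonstration):

```csharp
using UnityEngine;

// Sketch: string-based shader lookups must be updated by hand after upgrading.
public class ShaderLookupExample : MonoBehaviour
{
    void Start()
    {
        // Old Built-in pipeline lookup -- returns null in a URP-only build:
        // Shader legacy = Shader.Find("Standard");

        // URP equivalent of the Standard shader:
        Shader urpLit = Shader.Find("Universal Render Pipeline/Lit");
        if (urpLit != null)
            GetComponent<Renderer>().material.shader = urpLit;
    }
}
```

The same applies to UsePass blocks in hand-written shaders, which reference passes by path and must point at the new URP shader paths.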
10. Forward Renderer
— Increased light limits
– 8 lights per object (4 on GLES2)
– 256 lights per camera (32 on mobile)
— Additional directional lights are now supported.
36. Universal Render Pipeline | Design
Pipeline Callbacks
● Can be used to set up per-frame or per-camera shader data
● Render procedural cameras (planar reflections)
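A minimal sketch of hooking such a callback, using Unity's RenderPipelineManager events, which fire under any SRP; the shader property name `_PlanarCameraPos` is illustrative, not a URP built-in:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: subscribe to SRP camera callbacks to push per-camera shader data,
// e.g. for a planar-reflection setup, before each camera renders.
public class PerCameraSetup : MonoBehaviour
{
    void OnEnable()  => RenderPipelineManager.beginCameraRendering += OnBeginCamera;
    void OnDisable() => RenderPipelineManager.beginCameraRendering -= OnBeginCamera;

    static void OnBeginCamera(ScriptableRenderContext context, Camera camera)
    {
        // Runs once per camera per frame; a natural place to set globals
        // or to trigger rendering of a procedural (reflection) camera.
        Shader.SetGlobalVector("_PlanarCameraPos", camera.transform.position);
    }
}
```

There is a matching `beginFrameRendering` event for per-frame (rather than per-camera) setup.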
37. Programmable Rendering Pipeline
Overridable by implementing a custom renderer.

public abstract class ScriptableRenderer
{
    public abstract void Setup(ScriptableRenderContext context, ref RenderingData renderingData);

    public virtual void SetupCullingParameters(ref ScriptableCullingParameters cullingParameters,
                                               ref CameraData cameraData) {}
}
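A sketch of what overriding this abstract class can look like. It assumes URP's pattern of queueing ScriptableRenderPass objects via a protected EnqueuePass helper; `MyOpaquePass` is an illustrative pass name, not a URP type:

```csharp
// Sketch: a custom renderer decides, per camera, which render passes run.
public class MinimalRenderer : ScriptableRenderer
{
    // MyOpaquePass stands in for any ScriptableRenderPass implementation.
    MyOpaquePass m_OpaquePass = new MyOpaquePass();

    public override void Setup(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        // Inspect renderingData (camera, lights, quality) and queue the
        // passes this frame actually needs, in execution order.
        EnqueuePass(m_OpaquePass);
    }
}
```

Because Setup receives the full RenderingData, a renderer can skip passes entirely (e.g. no shadow pass when no light casts shadows), which is where much of URP's performance headroom comes from.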
40. Universal Render Pipeline | Design
Build Rendering Data
(Diagram: Rendering Data aggregates Culling Results, Camera Data, Lights, Objects, Quality Settings, GPU Caps, and the Target Output: Format, MSAA, Resolution, Shadows, HSR, Post, Stereo)
44. Roadmap
— Give us feedback and vote on features:
– http://bit.ly/UniversalProductBoard
— Camera Stacking coming soon!
— 2020:
– Deferred Renderer
– Feature Parity with Built-in
— 2021:
– Growing scope with more high-end features
45. Thank you!
Universal Rendering Pipeline and the features used to create Boat Attack
1pm - Auditorium 15
Upgrading your existing project to Universal Render Pipeline
3pm - Auditorium 15