This document provides best practices and recommendations for developing VR projects in Unreal Engine. It discusses proper world and character scaling, VR performance considerations like frame rate and profiling, and techniques for optimizing graphics like disabling post-processing effects and using static lighting. Specific tips are provided for areas like normal mapping, tessellation, lighting, particles, and instanced stereo rendering. Resources for learning VR development in Unreal like documentation, videos, and presentations are also listed.
2. VR Learning Resources for Unreal Engine:
Docs:
▪ Getting Started With VR
▪ UE4 VR Index Page
▪ VR Best Practices
▪ VR Cheat Sheets
▪ Oculus Quick Starts
▪ GearVR Quick Starts
Video:
▪ Integrating the Oculus Rift into UE4
▪ UE4 Support Stream - Developing for VR
▪ 2015 UE4 - VR and Unreal Engine
▪ Unreal Engine 4 Training Stream: Up and Running with Gear VR
Presentations:
▪ UE4 VR - Niklas Smedberg
▪ Lessons from Integrating the Oculus Rift into UE4
Links:
▪ Getting Started with VR in Unreal Engine 4
3. Things to keep at the front of your mind:
World Scale
Getting the scale of your world correct is one of the most important things you can do to deliver the best possible user experience on VR platforms. The wrong scale can lead to all kinds of sensory issues for users and could even result in simulation sickness. Objects are most easily viewed in VR when they are in a range of 0.75 to 3.5 meters from the player's camera. Inside of UE4, 1 Unreal Unit (UU) is equal to 1 centimeter (cm), so objects inside Unreal are best viewed when they are 75 UU to 350 UU away from the player's camera in VR.
Distance       Distance in Unreal Units (UU)
1 Centimeter   1 Unreal Unit
1 Meter        100 Unreal Units
1 Kilometer    100,000 Unreal Units
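The 1 UU = 1 cm convention and the comfortable viewing range above can be sketched as a quick helper. This is a standalone illustration in plain Python, not UE4 API; the function names are hypothetical:

```python
# 1 Unreal Unit (UU) == 1 centimeter at UE4's default World to Meters of 100.
def meters_to_uu(meters: float) -> float:
    """Convert meters to Unreal Units (1 m = 100 UU)."""
    return meters * 100.0

def comfortable_in_vr(distance_m: float) -> bool:
    """Objects are easiest to view between 0.75 m (75 UU) and 3.5 m (350 UU)."""
    return 75.0 <= meters_to_uu(distance_m) <= 350.0

print(meters_to_uu(1.0))       # 100.0
print(comfortable_in_vr(2.0))  # True  (200 UU is inside the comfort range)
print(comfortable_in_vr(5.0))  # False (500 UU is too far)
```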
4. Things to keep at the front of your mind:
World Scale
You can adjust the scale of your world with the World to Meters variable, located under World Settings inside of UE4. Exercise caution when adjusting this value, however: as above, selecting the wrong scale can create a disconnect between the world and the user, which can also lead to simulation sickness.
5. Things to keep at the front of your mind:
VR Character Settings
The setup for a character using a VR headset is slightly different than for a standard character. Properties like character Height, Width, Movement Speed, and Camera Location all need to be slightly modified to accommodate a VR character.
6. Things to keep at the front of your mind:
VR Character Settings
Character Height & Width
Character Height & Width should mimic real-life measurements as much as possible. Using sizes that are too big or too small could ruin the immersion that you are trying to achieve.

Property   UE4 Default   Recommended VR
Height:    192 cm        176 cm
Width:     84 cm         68 cm
7. Things to keep at the front of your mind:
VR Character Settings
Movement Speed
VR movement speed is a difficult property to recommend a setting for, because the speed you choose is mainly determined by the type of experience you are trying to achieve. In the Elemental VR demo, for example, the movement speed was cut to about 1/4 of normal.

Property          UE4 Default   Recommended VR
Movement Speed:   60 M/S        24 M/S
8. Things to keep at the front of your mind:
VR Character Settings
Camera Location
The VR camera needs to be positioned slightly lower than the base eye height to compensate for being at the character's eye level.

Property           UE4 Default   Recommended VR
Base Eye Height:   180 cm        160 cm
9. Things to keep at the front of your mind:
Make sure your project is running at your HMD's target frame rate before you build or add anything to your world.
10. Things to keep at the front of your mind:
Check your performance constantly to ensure that you are hitting your VR performance targets.
11. Things to keep at the front of your mind:
○ Maintain a very simplistic approach to making your content.
○ Minimize complex shaders as much as possible.
○ Add detail to the mesh, within reason, in lieu of relying on complex shaders for surface details.
12. Things to keep at the front of your mind:
LODs and aggressive culling are a must to ensure that you are hitting your VR performance targets.
13. Known issues and possible workarounds:
Screen Space Reflections (SSR)
SSR will work in VR but will not give you the results you want; look into using reflection probes instead.
14. Known issues and possible workarounds:
Normal Mapping Issues
When viewing Normal maps on objects in VR, you will notice that they do not have the impact they might once have had. This is because normal mapping does not account for a binocular display or motion parallax, so Normal maps come out looking flat when viewed with a VR device. To get around this, you can do one of two things.
15. Known issues and possible workarounds:
Parallax Mapping
Parallax mapping takes Normal mapping to the next level by accounting for depth cues that Normal mapping does not. A Parallax mapping shader can better display depth information, making objects appear to have more detail than they do, because no matter what angle you view it from, a Parallax map will always correct itself to show the correct depth information for that viewpoint. The best uses of a Parallax map are cobblestone pathways and fine detail on surfaces.
16. Known issues and possible workarounds:
Tessellation Shader Displacement
Tessellation Shader Displacement displaces 3D geometry in real time, adding details that are not modeled into the object. Tessellation shaders do a great job of displaying this information because they actually create the missing detail, generating more vertices and displacing them in 3D space.
17. New Project Settings:
When creating a new project for VR, it is best to create a project that uses the Mobile / Tablet setting with Scalable 3D or 2D and No Starter Content. If you need content, port or import only what you need rather than everything you have.
18. Launching VR Preview:
Testing out your VR setup is very straightforward: simply select "VR Preview" from the Play dropdown button. By default, head tracking will work right away without any changes to your existing project or template.
19. Using VR in Blueprint:
Using VR in Blueprint is very straightforward. You will need a Camera Component and, optionally, one or two Motion Controller Components. By default your Camera is already set up for HMD support; if you wish to disable rotation changes from the HMD, you can disable "Lock to HMD" in the Component's properties.
20. Performance Considerations:
For the VR experience to feel smooth, your game needs to run at 75 Hz (Oculus DK2) or even 90 Hz (HTC Vive and Oculus CV1), depending on the device. To see the current frame rate, type "stat fps" or "stat unit" (for a more detailed breakdown) into your console when running the game.
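Those refresh rates translate directly into per-frame time budgets. As a quick back-of-the-envelope calculation (plain Python, purely for illustration):

```python
# Per-frame time budget in milliseconds for a given HMD refresh rate.
def frame_budget_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for name, hz in [("Oculus DK2", 75), ("HTC Vive / Oculus CV1", 90)]:
    print(f"{name}: {hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
# Oculus DK2: 75 Hz -> 13.3 ms per frame
# HTC Vive / Oculus CV1: 90 Hz -> 11.1 ms per frame
```

In other words, every frame (game thread, render thread, and GPU combined, as reported by "stat unit") has to finish in roughly 13.3 ms at 75 Hz, or 11.1 ms at 90 Hz.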
21. GPU Profiling:
To capture a single frame with GPU timings, press Ctrl+Shift+, or type "profilegpu" into the console. This command dumps accurate GPU timings; you will find that certain processes are a heavy burden on the frame rate when using VR (Ambient Occlusion is one common example). The GPU Profiling and Performance and Profiling docs are a good place to learn about profiling your game.
22. Instanced Stereo:
The 4.11 release introduces Instanced Stereo Rendering; check the video below for a comparison of how it works.
"Basically, we're utilizing hardware instancing to draw both eyes simultaneously with a single draw call and pass through the render loop. This cuts down render thread CPU time significantly and also improves GPU performance. Bullet Train was seeing ~15 – 20% CPU improvement on the render thread and ~7 – 10% improvement on the GPU." – Ryan Vance
To enable this feature in 4.11 and above, go to your Project Settings and look for "Instanced Stereo" under the Rendering category.
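The same Project Settings toggle is stored in your project's configuration; a sketch of what the entry looks like in DefaultEngine.ini (assuming a 4.11+ project, verify against your engine version):

```ini
; Config/DefaultEngine.ini
[/Script/Engine.RendererSettings]
vr.InstancedStereo=True
```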
23. Disable Heavy Post-Processing:
Due to the demanding requirements of VR, many of the advanced post-processing features that you would normally use should be disabled. To accomplish this, do the following in your level:
• Add a Post Process (PP) volume to your level if there is not already one there.
• Select the PP volume and, in the Post Process Volume section, enable the Unbound option so that the settings in the PP volume are applied to the entire level.
• Expand the Settings of the Post Process Volume, then go through each section and disable any active PP settings: enable the property's override by clicking on it, then change the value from its default (usually 1.0) to 0 to disable the feature.
• You will not need to hit every section and set all the properties to 0. First disable the really heavy-hitting features like Lens Flares, Screen Space Reflections, Temporal AA, SSAO, and anything else that might have an impact on performance.
• While a lot of these features can also be disabled via settings in your .INI files, doing it in a volume ensures that performance is unaffected if someone deletes the .INI by mistake.
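For the .INI route mentioned above, UE4 exposes several of these features as project-wide renderer defaults. A sketch of the relevant entries (the exact set of r.DefaultFeature.* variables may vary by engine version, so verify against your Project Settings > Rendering page):

```ini
; Config/DefaultEngine.ini
[/Script/Engine.RendererSettings]
r.DefaultFeature.Bloom=False
r.DefaultFeature.AmbientOcclusion=False
r.DefaultFeature.AutoExposure=False
r.DefaultFeature.MotionBlur=False
r.DefaultFeature.LensFlare=False
; 0 = no AA, 1 = FXAA, 2 = Temporal AA
r.DefaultFeature.AntiAliasing=0
```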
24. Static, Stationary, and Dynamic Lighting:
You should always use Static lighting and lightmaps when making a VR project, as this is the cheapest option to render. If you need to use dynamic lighting, limit the number of dynamic lights to as few as possible and make sure that they never touch one another. If you have an outdoor scene, set your directional light to dynamic instead of stationary, turn on Cascaded Shadow Maps, and then adjust the settings to be as low as possible while still giving you shadows. This will take a lot of trial and error to get correct.
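A few shadow-related console variables can speed up that trial and error. The variable names below are from UE4's renderer; the values are only illustrative starting points, not recommendations:

```ini
; Config/ConsoleVariables.ini - scratch values for Cascaded Shadow Map tuning
r.Shadow.MaxCascades=2
r.Shadow.DistanceScale=0.7
r.Shadow.RadiusThreshold=0.05
```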
25. Fake shadows if you can:
Using fake blob drop shadows to simulate dynamic shadows is a good general rule to keep your VR project running at frame rate.

Blob shadow example. Image by Eric Chadwick
26. VFX in VR:
Some VFX techniques, like using SubUV textures to simulate fire or smoke, do not hold up well when viewed in VR. In many cases it is more desirable to use Static Mesh emitters instead of 2D sprite particles to simulate effects like explosions or smoke trails. Near-field effects, or effects that happen very close to the camera, can work well in VR, but only when the effects are made up of Static Mesh particles.
27. Again, VR Learning Resources for Unreal Engine:
Docs:
▪ Getting Started With VR
▪ UE4 VR Index Page
▪ VR Best Practices
▪ VR Cheat Sheets
▪ Oculus Quick Starts
▪ GearVR Quick Starts
Video:
▪ Integrating the Oculus Rift into UE4
▪ UE4 Support Stream - Developing for VR
▪ 2015 UE4 - VR and Unreal Engine
▪ Unreal Engine 4 Training Stream: Up and Running with Gear VR
Presentations:
▪ UE4 VR - Niklas Smedberg
▪ Lessons from Integrating the Oculus Rift into UE4
Links:
▪ Getting Started with VR in Unreal Engine 4
Speaker notes:
The info from the docs is old; this is a feature that now works but still does not look that good. I am going to re-write this on the UDN.
This feature will only work in 4.11 or 4.12, as the VR team just got it working in VR.
If this is grayed out, it means that UE4 is not seeing your HMD. The only way to fix this is to close down UE4 and make sure that your HMD is connected and working before you launch UE4.
You need to be careful when using the stat commands UE4 offers, as they cannot read or display exactly what is happening in the VR SDK. While the stat commands can be good for letting you know that there might be a problem, you should use the profiling tools offered by the HMD manufacturers when you want to profile.