The document provides resources for learning how to develop VR games and experiences in Unreal Engine, including documentation, video tutorials, and examples from the developer community on locomotion techniques, reducing motion sickness, optimizing performance, and other best practices for VR development in UE4. It also gives an overview of key concepts such as the engine framework and programming interactions using Blueprints.
For Battlefield 3, DICE took on its most difficult challenge to date. To raise the bar for character quality in games, we developed our own deformation rig and combined it with the powerful ANT animation system (used in FIFA) and extensive motion capture. To create a believable experience, we built and managed an enormous number of assets and devised ways of keeping them organized. Rigging was one of the most challenging aspects of production, with the smallest change requiring an update to almost every asset. With a modular rigging system and a flexible animation pipeline, the production team could deliver on time and on quality.
Killzone Shadow Fall: Creating Art Tools For A New Generation Of Games (Guerrilla)
This talk describes the tool improvements Guerrilla Games implemented to make Killzone Shadow Fall shine on the PlayStation 4. It highlights additions to the Maya pipeline, such as Viewport 2.0, Maya's coupling with in-game updates and in-engine deferred renderer features including real-time shadow-casting, volumetric lighting, hardware instancing, lens flares and color grading.
This session introduces the basics of lightmapping for beginners. Learn about some of the most common issues new users struggle with and how to solve them. Attendees will also see some of the new features planned for 2019.3.
Speaker: Jennifer Nordwall (Unity)
Watch the session on YouTube: https://youtu.be/pcXpY_IBSeU
Player Traversal Mechanics in the Vast World of Horizon Zero Dawn (Guerrilla)
Download the original PowerPoint presentation here: http://www.guerrilla-games.com/read/player-traversal-mechanics-in-the-vast-world-of-horizon-zero-dawn
Paul van Grinsven shows what is needed to make Aloy traverse the vast world of Horizon Zero Dawn, with its complex and organic environments. Various traversal mechanics are covered from a gameplay programmer's perspective, focusing on the interaction between code and animations. The different systems and techniques involved in the implementation of these mechanics are explained, and Van Grinsven looks at the underlying reasoning and design decisions.
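The interaction between code and animations described above can be sketched, in very simplified form, as a mapping from an environment probe to an animation tag. Everything here (function name, animation tags, height thresholds) is illustrative and invented for this sketch, not Guerrilla's actual system:

```python
# Hypothetical sketch: picking a traversal animation from an obstacle probe.
# Tags and thresholds are illustrative, not Horizon Zero Dawn's data.

def select_traversal_animation(obstacle_height_m: float) -> str:
    """Map a probed obstacle height to a traversal animation tag."""
    if obstacle_height_m <= 0.0:
        return "Walk"          # nothing to traverse
    if obstacle_height_m <= 1.0:
        return "VaultLow"      # quick vault over waist-high cover
    if obstacle_height_m <= 2.2:
        return "ClimbMantle"   # mantle up a ledge within arm's reach
    return "ClimbFreehold"     # full climb using annotated handholds

print(select_traversal_animation(0.8))  # VaultLow
print(select_traversal_animation(1.8))  # ClimbMantle
```

In a real implementation the gameplay code would also blend into and out of these states and warp the character to the contact points, which is where most of the code/animation interaction lives.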
Watch the session video here:
https://youtu.be/BoUNuMJGHuc
Speaker:
Osamu Saito (Epic Games Japan)
https://twitter.com/shiba_zushi
These slides are from the online study session "UE4 Character Art Dive Online" held on July 25, 2021.
About the event:
https://www.unrealengine.com/ja/blog/epicgamesjapan-onlinelearning-13
Putting the AI Back Into Air: Navigating the Air Space of Horizon Zero Dawn (Guerrilla)
Download the full presentation here: http://www.guerrilla-games.com/read/putting-the-ai-back-into-air
Abstract: In this talk, we explain the technology behind the aerial navigation in Horizon Zero Dawn. In Horizon, we've represented the flyable air space by use of a run-time generated height map. Queries can be done on this height map for positional information and navigation. We present a hierarchical path planning algorithm for finding a progressively more detailed path between two points. Additionally, we will touch on some gameplay related subjects, to show the additional challenges we faced in implementing the different flying behaviors, such as transitioning from air to ground and guided crash-landing.
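The "progressively more detailed path" idea from the abstract can be illustrated with a toy two-level planner: search a coarsened grid first, then run the fine search only inside the corridor the coarse path defines. This is a generic sketch over abstract walkable cells, not Guerrilla's run-time height-map representation; all names are invented:

```python
# Illustrative two-level (hierarchical) path planner, not Guerrilla's code.
from heapq import heappush, heappop

def astar(walkable, start, goal):
    """Plain 4-connected A* over a set of walkable (x, y) cells."""
    open_q, came, g = [(0, start)], {start: None}, {start: 0}
    while open_q:
        _, cur = heappop(open_q)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in walkable and g[cur] + 1 < g.get(nxt, float("inf")):
                g[nxt] = g[cur] + 1
                came[nxt] = cur
                h = abs(nxt[0] - goal[0]) + abs(nxt[1] - goal[1])
                heappush(open_q, (g[nxt] + h, nxt))
    return None

def hierarchical_path(fine_cells, start, goal, block=4):
    """Coarse pass over block-sized cells, then a fine pass in the corridor."""
    coarse = {(x // block, y // block) for x, y in fine_cells}
    coarse_path = astar(coarse, (start[0] // block, start[1] // block),
                        (goal[0] // block, goal[1] // block))
    if coarse_path is None:
        return None
    blocks = set(coarse_path)
    corridor = {c for c in fine_cells
                if (c[0] // block, c[1] // block) in blocks}
    return astar(corridor, start, goal)
```

A production system would fall back to a wider corridor (or a full fine search) when the corridor turns out to be blocked at the fine level; that refinement loop is omitted here.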
Watch the session video here:
https://youtu.be/GEl8AfgI35g
Speaker:
Hiroyuki Kobayashi (Epic Games Japan)
https://twitter.com/hannover_bloss
These slides are from the online study session "UE4 Character Art Dive Online" held on July 25, 2021.
About the event:
https://www.unrealengine.com/ja/blog/epicgamesjapan-onlinelearning-13
Next-generation gaming brought high resolutions, very complex environments and large textures to our living rooms. With virtually every asset inflated, it is hard to use traditional forward rendering and still hope for rich, dynamic environments with extensive dynamic lighting. Deferred rendering, on the other hand, has traditionally been described as an attractive technique for rendering scenes with many dynamic lights, but one that suffers from fill-rate problems and a lack of anti-aliasing; very few shipped games have used it.
In this talk, we discuss our approach to this challenge and how we designed a deferred rendering engine that uses multi-sampled anti-aliasing (MSAA). We give an in-depth description of each stage of our real-time rendering pipeline and the main ingredients of our lighting, post-processing and data management. We show how we utilize the PS3's SPUs for fast rendering of large sets of primitives, parallel processing of geometry and computation of indirect lighting. We also describe our lighting optimizations and our parallel-split (cascaded) shadow map algorithm for faster and more stable MSAA output.
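The core deferred idea the talk builds on, shading cost proportional to pixels times lights rather than geometry times lights, can be shown with a tiny CPU-side sketch: a geometry pass fills a G-buffer once, and the lighting pass then iterates lights per pixel without touching geometry again. This is a generic, Lambert-only illustration (no MSAA, no SPU specifics) and not the engine's actual pipeline:

```python
# Toy deferred-shading sketch: light accumulation over a prebuilt G-buffer.
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v)) or 1.0
    return tuple(x / length for x in v)

def shade_deferred(gbuffer, lights):
    """gbuffer: per-pixel dicts with 'albedo', 'normal', 'pos' (the data a
    geometry pass would have written). Returns lit RGB per pixel."""
    out = []
    for px in gbuffer:
        lit = 0.0
        for light in lights:            # cost scales with pixels x lights
            to_light = norm(sub(light["pos"], px["pos"]))
            lit += max(0.0, dot(px["normal"], to_light)) * light["intensity"]
        out.append(tuple(c * lit for c in px["albedo"]))
    return out
```

The anti-aliasing difficulty the abstract mentions comes from the fact that each pixel here holds a single surface sample; an MSAA-aware variant must store and resolve multiple samples per pixel, which is exactly what the talk addresses.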
Taking Killzone Shadow Fall Image Quality Into The Next Generation (Guerrilla)
This talk focuses on the technical side of Killzone Shadow Fall, the platform exclusive launch title for PlayStation 4.
We present the details of several new techniques developed in the quest for next-generation image quality, using key locations from the game as examples. We discuss interesting aspects of the new content pipeline, the next-gen lighting engine, the usage of indirect lighting, and various shadow-rendering optimizations. We also describe the details of volumetric lighting, the real-time reflections system and the new anti-aliasing solution, and include some details about the image-quality-driven streaming system. A common and very important theme of the talk is temporal coherency and how it was utilized to reduce aliasing and to improve rendering quality and image stability above the baseline 1080p resolution seen in other games.
Space Ape's Live Ops Stack: Engineering Mobile Games for Live Ops from Day 1 (Simon Hade)
To view the accompanying video see http://links.spaceapegames.com/liveops
Around half of the $80m revenue generated by Space Ape's three mid-core build-and-battle games is attributable to in-game events. By adopting a flexible, forward-looking approach to tools development, Space Ape efficiently operates its games with very small non-technical teams maintaining major weekly content update cycles.
In this talk, Space Ape’s senior Live Ops specialists give a demo of their tools and workflows and share the content strategies that have allowed them to grow revenues whilst enabling the studio to focus the majority of its development capacity on creating new games and IP.
DESIGNING SUCCESSFUL LIVE OPS SYSTEMS IN FREE TO PLAY GACHA ECONOMIES
Space Ape shipped Transformers: Earth Wars in the summer, pre-baked with the community events tools that had worked so well in their previous game, Rival Kingdoms. However, they soon realised that many of the old tricks did not apply to the game's gacha collection economy, which had more in common with Kabam's Contest of Champions than with the linear economies of most build-and-battle games. In this talk, Space Ape's Live Ops Lead Andrew Munden (formerly Live Ops Lead at Kabam) will share the content strategies that work in gacha collection games, as well as how to build a manageable content furnace and balance player fatigue in a sustainable way.
A BRIEF HISTORY OF IN-GAME TARGETING
Analytics lead Fred Easy (ex-Betfair, Playfish/EA) will share the evolution of his offer-targeting technology, from its belt-and-braces beginnings to sophisticated value-based targeting and the transition to a dynamic, in-session machine learning approach.
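The step from rule-based to value-based targeting can be illustrated with a toy ranker that scores each offer by estimated conversion probability times price and shows the best one. All numbers, field names and the uplift rule below are invented for illustration; they are not Space Ape's model:

```python
# Hypothetical value-based offer targeting sketch (invented data and rules).

def pick_offer(player, offers):
    """Return the offer maximizing estimated conversion probability x price."""
    def expected_value(offer):
        p = offer["base_rate"]
        if player["spend_tier"] == "payer" and offer["price"] >= 10:
            p *= 2.0  # assumed uplift: payers convert better on big bundles
        return p * offer["price"]
    return max(offers, key=expected_value)

offers = [
    {"name": "starter", "price": 2,  "base_rate": 0.20},
    {"name": "whale",   "price": 50, "base_rate": 0.005},
]
print(pick_offer({"spend_tier": "payer"}, offers)["name"])  # whale
print(pick_offer({"spend_tier": "free"}, offers)["name"])   # starter
```

The "dynamic in-session" stage mentioned above would replace the hand-written `base_rate` and uplift rule with a model scored on live session features.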
UNDER THE HOOD: RIVAL KINGDOM'S CMS TOOLS
Game-changing content is introduced to Rival Kingdoms every month, with in-game events at least every week. Product Manager Mitchell Smallman (formerly Rovio, Next Games) and Steven Hsiao (competitive StarCraft player turned community manager turned Live Ops lead) will demonstrate the content management tools that allow them to keep the game fresh for players without developer support. This includes the tools for configuring competitive events and inserting new content into the game, as well as how they measure the performance of changes and optimise on the fly. Learn how these tools enabled them to grow revenue for six consecutive months with no marketing spend.
To find out more about the developer go to www.spaceapegames.com
Epic Games Japan held a meeting named "Lightmass Deep Dive" on July 30, 2016.
Kenichi Makaya, a Japanese architectural artist, recreated Casa Barragan, the house of the Mexican architect Luis Barragan, in UE4, and gave a presentation about the making of the scene.
CASA BARRAGAN Unreal Engine4
https://www.youtube.com/watch?v=Y7r28nO4iDU&feature=youtu.be
EGJ translated the slides for the presentation into English and published them.
Epic Games Japan held a meeting named "Lightmass Deep Dive" on July 30, 2016.
Osamu Saito of Square Enix Osaka gave a presentation about their Lightmass operation for large console games. EGJ translated the slides for the presentation into English and published them.
The slides contain several videos, so we recommend downloading the deck.
Unreal Engine for Educators (Luis Cataldi)
This is a deck from my presentation at the East Coast Game Conference in Raleigh, NC, in April 2015. The presentation focused on how students and educators can learn to utilize Unreal Engine to help achieve career goals in the game industry.
GDC Europe 2014: Unreal Engine 4 for Programmers - Lessons Learned & Things t... (Gerke Max Preussner)
A high-level overview of Unreal Engine 4, its game framework, the Slate user interface library, Unreal Motion Graphics, and Editor and Engine extensibility. Presented at GDC Europe in Cologne, Germany.
Also includes bonus slides on concurrency and parallelism features, general tips for programmers and Epic's build and automation infrastructure.
Overview of the basics of modules, plug-ins and projects in UE4, and a deep dive into integrating third-party dependencies. Presented at MIGS 2016 in Montreal.
Alexey Savchenko, Epic Games
This report highlights the most recent changes in this version of the technology, with reference to specific developments. We will look at best practices for developing PC, mobile and web games, and also touch on prospects for developing applications and games for VR.
How broadcasters can get in the VR game with sports (ETCenter)
With new distribution deals from the NFL on Twitter to ESPN on Sling, how we watch TV is now driven by consumer demand to do more while tuning in to watch our favorite team. Enter virtual reality. VR is the first truly transformative technology for sports broadcasting in years; to date, the biggest improvements we've seen have been HD (just a better picture) and "the yellow line." With VR, we can actually take viewers to the game, as if they were sitting courtside or on the 50-yard line, while still letting them check Twitter, trash talk and follow their team in real time.
Speakers: Saswat Panda (CTO, Livelikevr) and Michael Davies (SVP, Fox Sports)
Virtual Reality gaming: analysis of Yon Paradox development - Fabio Mosca - C... (Codemotion)
Since the Kickstarter campaign in August 2012, Oculus Rift has targeted games for virtual reality. "If a game works, then everything will work." Even though VR has been used a lot for business and industrial purposes, the community of VR and game developers pushed the platform forward and tested the limits of this new medium. In this talk I will analyze the design and development process of Yon Paradox, sharing our results on what went well and what didn't.
Virtual Training, Real Results: Exploring the Potential of VR in the Workplace (Aggregage)
This webinar aims to educate attendees on the basics of VR technology, its applications across various industries, and its potential for transforming the way your employees learn, work, and interact.
Building the Matrix: Your First VR App, SVCC 2016 (Liv Erickson)
The slides from my talk, Building The Matrix: Your First VR App at Silicon Valley Code Camp, Oct. 2016. Development, design, and sample projects for virtual reality applications.
Hiren Bhinde (Qualcomm): On-device Motion Tracking for Immersive VR (AugmentedWorldExpo)
A talk from the Tools & Products Track at AWE USA 2017, the largest conference for AR+VR, held in Santa Clara, California, May 31 - June 2, 2017.
Hiren Bhinde (Qualcomm): On-device Motion Tracking for Immersive VR
As the industry strives toward immersive augmented and virtual reality experiences, we are guided by the extreme requirements associated with intuitive interactions and visual and sound quality, in order to achieve the ultimate untethered user experience. Precise, low-latency motion tracking of head movements is crucial for intuitive interactions with the virtual world, and visual-inertial odometry (VIO) is the ideal complementary subsystem to achieve this goal. VIO allows for six degrees of freedom in VR/AR experiences, reduces latency and cuts the cord. In this session, developers will learn about the evolution of motion tracking and dive into six degrees of freedom and its impact on VR/AR content development and user experiences.
http://AugmentedWorldExpo.com
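The fusion idea behind VIO described above, a fast-but-drifting inertial path corrected by slower, drift-free visual fixes, can be sketched for a single yaw axis with a complementary filter. This is a didactic toy under our own assumptions (names, `alpha`, single axis), not Qualcomm's tracker:

```python
# Minimal 1-axis complementary filter: gyro integration + visual correction.
# A real VIO system fuses full 6-DoF pose, typically with an EKF.

def fuse_yaw(yaw, gyro_rate, dt, visual_yaw=None, alpha=0.98):
    """Integrate the gyro every step; when a (slower, drift-free) visual
    pose estimate arrives, blend gently toward it to cancel drift."""
    yaw += gyro_rate * dt              # low-latency inertial path
    if visual_yaw is not None:         # occasional camera-based fix
        yaw = alpha * yaw + (1 - alpha) * visual_yaw
    return yaw

# One inertial step, then a visual correction pulling back toward 0.0:
y = fuse_yaw(0.0, gyro_rate=1.0, dt=0.1)          # 0.1 rad after integration
y = fuse_yaw(y, gyro_rate=0.0, dt=0.1, visual_yaw=0.0)
```

The key property, visible even in this toy, is that head motion shows up immediately through the gyro term while the visual term only bounds long-term drift, which is what keeps motion-to-photon latency low.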
Vision Summit 16 - Tips and Tricks for VR Game Development (Rafael Ferrari)
After working with VR technology for more than a year straight, on Rococo VR and Finding Monsters Adventure VR, we are more than happy to share our tips and tricks for Gear VR game development. From programming to interface design, come over and bring your questions!
Using Intel's RealSense to create games with natural user interfaces - Justi... (BeMyApp)
As technology advances, more sophisticated ways of interfacing with it are emerging. Even though new tech strives to make our apps more intuitive and easy to use, designing interfaces for those apps is not quite as straightforward. We've learned a few rules and "gotchas" when working with gesture cameras that can help to make apps that use them easy and fun to use.
In this talk Justin described:
1. Different data types you can get from Intel® RealSense™ and how to get them
2. Designing an interface for a gesture camera
3. Using your hands, face, and voice as an interface
Powering Next-Gen Learning with VR and xAPI - DevLearn 2018 (Margaret Roth)
Virtual reality technologies have long been the promise of the future but just out of reach for the mainstream. Recent VR innovations, though, have allowed instructional designers and learning engineers to create and distribute custom VR content in ways that make VR a transformative part of training and learning programs across industries. When combined with xAPI, these futuristic technologies allow you to gain never-before-captured insights from next-gen digital experiences.
This session will take a look at how VR powered by xAPI is currently being used by instructional designers, learning engineers, and L&D professionals to gain new insights from next-gen learning experiences. You will explore case studies that demonstrate how VR interactions allow learners to explore and participate in engaging and intuitive 360-degree virtual environments designed to expand their vision and promote learning, impact, and retention. You will see case studies demonstrating how organizations are using xAPI-enabled VR content to enhance learning, from safety and compliance to onboarding and training.
* Originally presented on 10/26/18 at DevLearn 2018 with Margaret Roth, Mel Milloway and John Blackmon.
"Improving the VR experience, from the authors to the users"
Creating an immersive virtual reality application is a big challenge: choosing (or creating) the right hardware, choosing (or creating) the right software, and finally crafting the user experience. The hardware is increasingly powerful and accessible, but we don't know how to make the best of it. This is in part because designing a VR experience is a complex software task, and is also due to our limited understanding of the main component of the system: the user.
In this talk we will focus on the current trends in system design and on the goals and design of MiddleVR, a generic VR plugin aimed at simplifying the creation of VR applications, and we will discuss how our understanding of human perception can be used to improve the VR experience.
Tero Sarkkinen (Basemark): Latency Testing and Performance Optimization of VR ... (AugmentedWorldExpo)
Maintaining presence is key for any VR application, be it a game or a business app. In this session, you will learn how to test for latency, which components of performance are relevant, and how to measure and optimize each of them.
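One ingredient of the testing approach described above can be sketched with plain instrumentation: record per-frame times and report tail percentiles, since presence is broken by worst-case hitches that an average frame rate hides. The helper below is a generic illustration, not Basemark's methodology:

```python
# Frame-time instrumentation sketch: tail percentiles matter more than means.

def frame_stats(frame_times_ms):
    """Summarize a list of frame times (ms) by median, p99 and worst case."""
    ordered = sorted(frame_times_ms)
    def pct(p):
        return ordered[min(len(ordered) - 1, int(p / 100 * len(ordered)))]
    return {"p50": pct(50), "p99": pct(99), "worst": ordered[-1]}

# 100 frames at ~90 Hz with a single 25 ms hitch: the mean barely moves,
# but p99 exposes the dropped frame.
samples = [11.1] * 98 + [11.2, 25.0]
print(frame_stats(samples))
```

End-to-end (motion-to-photon) latency additionally requires external measurement, for example a photodiode on the display triggered by tracked motion; frame timing alone only bounds the application's contribution.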
9. VR Learning Resources for Unreal Engine:
Docs:
• Getting Started With VR
• UE4 VR Index Page
• VR Best Practices
• VR Cheat Sheets
• Oculus Quick Starts
• GearVR Quick Starts
10. VR Learning Resources for Unreal Engine:
Video:
• Integrating the Oculus Rift into UE4
• UE4 Support Stream - Developing for VR
• 2015 UE4 - VR and Unreal Engine
• Unreal Engine 4 Training Streams
11. VR Learning Resources for Unreal Engine:
Presentations:
• Nick and Nick – Going Off the Rails: The Making of Bullet Train
• Lessons from Integrating the Oculus Rift into UE4
Links:
• Tom Looman’s - Getting Started with VR in Unreal Engine 4
• Sam Deiter - 10 VR tips for Unreal Engine
12. Education Community VR for UE4:
Mitchell McCaffrey’s - Mitch VR Labs
Mitch's VR Lab - an Introduction
Mitch's VR Lab - Look Based interaction
Mitch's VR Lab - Simple Teleportation Mechanic
Mitch's VR Lab - Introduction to SteamVR
Mitch's VR Lab - Simple Head IK
Mitch’s UE4 Forum Post
13. Education Community VR for UE4:
Carlos Coronado - VR Olive FPS Controller
Carlos’s UE4 Forum Post
Olive VR Locomotion: Movement
Olive VR Locomotion: Shooting
Olive VR Locomotion: Menus
Let’s take a look at Carlos’s Look Based Locomotion for Annie Amber
14. Before we get much deeper into Unreal Engine...
What are Mitch, Carlos, and the rest of us solving for?
15. One of the biggest issues for working in VR is Motion/Simulation Sickness.
17. en.wikipedia.org/wiki/Virtual_reality_sickness
Sensory conflict theory holds that sickness occurs when a user's perception of self-motion is based on incongruent sensory inputs from the visual system, vestibular system, and non-vestibular proprioceptors, particularly when these inputs are at odds with the user's expectation based on prior experience.
18. Five typical causes of Motion/Simulation Sickness in VR
Read more about it
1. Non-forward movements
• Avoid unnatural movements
2. Awareness of Vection
• When a large part of the visual field moves, the viewer feels that they have moved and that the world is stationary
3. The feeling of acceleration
4. Too much camera yaw
5. Lack of a static reference frame (adding one helps)
19. Education Community Tips to Reduce
Motion/Simulation Sickness
Extra Credits - Simulation Sickness
Offpeak Games - 5 Design Techniques to Reduce Simulator Sickness
GDC - Designing to Minimize Simulation Sickness in VR Games
VR Best Practices, Eliminating Motion Sickness - Power of Play 2015
20. Education Community VR for UE4:
Mitchell McCaffrey’s - Mitch’s VR Game Template
Jun 2014 UE Forum Post of VR Game Templates
Space Shooter Template
First Person Template
22. UE4 VR Locomotion Techniques
Look-Based Locomotion/Interaction by Carlos Coronado
MIND: Path to Thalamus in UE4 and VR.
Annie Amber
23. Things we CAN DO in Unreal Engine to
improve VR Games and Experiences
24. You MUST maintain framerate
For the VR experience to feel smooth, your game needs to run at 75 Hz (Oculus DK2) or even 90 Hz (HTC Vive and Oculus CV1) depending on the device. To see the current framerate, type "stat fps" or "stat unit" (for a more detailed breakdown) into your console while running the game.
25. Use UE4's VR Performance Profiling Tools
To capture a single frame with GPU timings, press Ctrl+Shift+, or type "profilegpu" into the console. This command dumps accurate GPU timings; you will find that certain processes are a heavy burden on the framerate when using VR (Ambient Occlusion is one common example).
The GPU Profiling and Performance and Profiling docs are a good place to learn about profiling your game.
26. VR Instanced Stereo Can Help
The 4.11 release introduces Instanced Stereo Rendering; check the video below for a comparison of how it works.
"Basically, we're utilizing hardware instancing to draw both eyes simultaneously with a single draw call and pass through the render loop. This cuts down render thread CPU time significantly and also improves GPU performance. Bullet Train was seeing ~15 – 20% CPU improvement on the render thread and ~7 – 10% improvement on the GPU." – Ryan Vance
To enable this feature in 4.11 and above, go to your Project Settings and look for "Instanced Stereo" under the Rendering category.
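The Project Settings checkbox is backed by a renderer console variable, so the same setting can be made in config. A sketch of what this looks like in DefaultEngine.ini (assuming the `vr.InstancedStereo` cvar name used by 4.11-era engines; verify against your engine version):

```ini
[/Script/Engine.RendererSettings]
vr.InstancedStereo=True
```

Note that changing this setting requires an editor restart, since it affects how shaders are compiled.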
27. Disable Heavy Post-Processing Features
Due to the demanding requirements of VR, many of the advanced Post Processing features that you would normally use should be disabled. To accomplish this, do the following in your level:
• Add a Post Process (PP) volume to your level if there is not already one there.
• Select the PP volume and, in the Post Process Volume section, enable the Unbound option so that the settings in the PP volume are applied to the entire level.
• Expand the Settings of the Post Process Volume, then go through each section and disable any active PP setting by clicking its property to enable the override and setting the value from the default (usually 1.0) to 0.
• You do not need to hit every section and set every property to 0. Instead, first disable the really heavy-hitting features like Lens Flares, Screen Space Reflections, Temporal AA, SSAO, and anything else that might have an impact on performance.
• While a lot of these features can also be disabled in your .INI files, doing it in the volume ensures that performance is unaffected if someone deletes the .INI by mistake.
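For reference, the .INI route mentioned in the last bullet looks roughly like this in DefaultEngine.ini. These `r.DefaultFeature.*` names are the project-default toggles as I understand them for 4.x; treat them as an assumption and verify against your engine version:

```ini
[/Script/Engine.RendererSettings]
; Turn off heavy project-default post-process features for VR.
r.DefaultFeature.Bloom=False
r.DefaultFeature.AmbientOcclusion=False
r.DefaultFeature.MotionBlur=False
r.DefaultFeature.LensFlare=False
```

The Post Process volume overrides remain the more robust option, for exactly the reason the slide gives.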
28. Things to keep at the front of your mind:
LODs and aggressive culling are a must to ensure that you are hitting your VR performance targets.
29. Known issues and possible workarounds:
Parallax Mapping
Parallax mapping takes Normal mapping to the next level by accounting for depth cues that Normal mapping does not. A Parallax mapping shader can better display depth information, making objects appear to have more detail than they do, because no matter what angle you view it from, a Parallax map will always correct itself to show the correct depth information from that viewpoint. The best uses of a Parallax map are cobblestone pathways and fine detail on surfaces.
30. UE4 – Lighting for VR
Dimmer lights & colors can help reduce simulation sickness.
Use Static Lighting over Stationary or Dynamic.
Make sure your Stationary / Dynamic Lights do not overlap.
Baked lighting is the best option for VR.
If using Dynamic Shadows only have one shadowing light.
Use Stat LightRendering to see current lighting cost.
Profile, Profile, Profile
31. Fake Shadows Wherever You Can!!
Using fake blob shadow drops to simulate dynamic shadows is a good general rule to keep your VR project running at framerate.
Blob shadow example. Image by Eric Chadwick
32. UE4 – Effects for VR
Mesh based VFX work the best for VR.
Camera Facing particles do not hold up well in VR on their own.
The Dither Temporal AA Material Function can make Opacity masked objects look
like Translucent ones.
Local Space rotation does not look correct in VR.
33. UE4 – Environments for VR
Use reflection probes instead of screen space reflections.
Again… texture blob shadows are a cheap alternative to dynamic shadows.
The Merge Actors Tool can help cut down on Static Mesh draw calls without having to do work outside of UE4.
36. The Unreal Engine Framework
● GameInstance
● GameMode
● Pawn Class
● HUD Class
● PlayerController Class
● GameState Class
● PlayerState Class
37. The Unreal Engine Framework
The GameMode is the definition of the game.
● It should include things like the game rules and win conditions.
● It also holds important information about:
○ Pawn
○ PlayerController
○ GameState
○ PlayerState
38. The Unreal Engine Framework
The Pawn class is the base class of all Actors that can be controlled by players or AI.
● The Pawn represents the physical location, rotation, etc. of a player or entity within the game.
● A Character is a special type of Pawn that has the ability to walk around.
39. The Unreal Engine Framework
A PlayerController is the interface between the Pawn and the human player controlling it.
● The PlayerController decides what to do and then issues commands to the Pawn (e.g. "start crouching", "jump").
● Putting input handling or other functionality into the PlayerController is often necessary.
● The PlayerController persists throughout the game, while the Pawn can be transient.
40. The Unreal Engine Framework
The GameInstance is a class whose state persists across changes of levels, game modes, pawns, etc., whereas classes like GameMode or PlayerController are reset and the data stored in them is removed.
41. The Unreal Engine Framework
The GameState contains the state of the game, which could include things like the list of connected players, the score, where the pieces are in a chess game, or the list of missions you have completed in an open world game.
42. The Unreal Engine Framework
A PlayerState is the state of a participant in the game, such as a human player or a bot that is simulating a player. Non-player AI that exists as part of the game would not have a PlayerState.
43. The Unreal Engine Framework
The HUD is the base object for displaying elements overlaid on the screen. Every human-controlled player in the game has their own instance of the AHUD class, which draws to their individual Viewport.
46. Actor – Base building block in the Unreal Engine; any object that can be placed into a level.
Pawn – Subclass of Actor that serves as an in-game avatar.
47. Actor – Base building block in the Unreal Engine; any object that can be placed into a level.
Pawn – Subclass of Actor that serves as an in-game avatar.
Character – Subclass of Pawn that is intended to be used as a player character.
51. Programming VR Interaction with Blueprints
Blueprints in Unreal Engine is a complete visual scripting system based on the concept of using a node-based interface to create interactions from within the Unreal Editor.
54. UE4 – Audio for VR
Ambient Sound Actors in VR
The Ambient Sound Actor can be used for many purposes, such as ambient looping sounds and non-looping sounds. Generally, the Ambient Sound Actor conforms to the real world: the closer you are to a sound, the louder it will appear.
55. UE4 – Audio for VR
Sound Properties
You can assign a sound asset from the Details panel by selecting an asset from the Sound settings drop-down menu, or by highlighting a sound asset in the Content Browser and clicking the button.
56. UE4 – Audio for VR
Attenuation Properties
Attenuation is the ability of a sound to decrease in volume as the player moves away from it.
It is advisable to use Sound Attenuation objects whenever possible, if for no other reason than to give broad control over the settings for many Actors.
57. UE4 – Audio for VR
New: Stereo Spatialization
3D spatialization is now possible for stereo audio assets.
The 3D Stereo Spread parameter defines the distance in game units between the left and right channels, along a vector perpendicular to the listener-emitter vector.
58. UE4 – Audio for VR
Audio Volume
Audio Volumes allow you to control and apply various sounds in your level, as well as provide an avenue to create compartmentalized audio zones where you can control what is heard inside and outside of the volume.
59. Additional toolsets in Unreal Engine to enhance VR:
A complete, state-of-the-art suite of AI tools.
60. Additional toolsets in Unreal Engine to enhance VR:
A complete set of tools for animation retargeting.