Virtual reality will provide the ultimate level of immersion, creating a sense of physical presence in real or imagined worlds.
This whitepaper describes:
• The unprecedented experiences and unlimited possibilities that VR enables.
• Why VR is happening now and how mobile technologies are accelerating VR adoption.
• The extreme requirements for immersive VR in terms of visual quality, sound quality, and intuitive interactions.
• Why Qualcomm Technologies is uniquely positioned to meet these extreme requirements and deliver superior mobile VR experiences.
Learn more at: https://www.qualcomm.com/VR
Download the paper at: https://www.qualcomm.com/documents/whitepaper-making-immersive-virtual-reality-possible-mobile
Sign up for our mobile computing newsletter at: https://www.qualcomm.com/invention/technologies/mobile-computing/signup
Providers of live, linear, and VOD content strive to engage viewers on all of the most popular screens - and there will soon be two new device types to add to the must-reach list. Virtual reality (VR) devices like Samsung Gear VR, Oculus Rift, HTC Vive, Playstation VR, and Google Cardboard will be added first. A little further out, augmented reality (AR) devices like Microsoft HoloLens and Meta 2 will also belong on the list.
Virtual Reality Playbook for Marketing and PR, by Stefan Spinnler
Virtual Reality is on the rise and will become part of marketing and PR strategies around the world. This Playbook offers ideas and guidelines for marketers and PR professionals on how to make VR work for them.
Virtual Reality for Training, Learning, Education and Visualisation, by Daden Limited
A presentation version of our Virtual Reality white paper taking a balanced look at the use of Virtual Reality in support of training, and some of the issues that need to be considered.
The way we will interact with digital content is about to rapidly change, due to the emergence of Consumer Virtual Reality.
This KZero Worldswide report explains the state of the Consumer Virtual Reality market, the devices being created, the companies operating in it, market size forecasts and commercial application examples for key Virtual Reality markets.
Virtual reality: What you see is what you believe, by Kaishik Gundu
One of the most recent and most famous technologies in the world, with good applications in the modern era. This is a small slide show on the topic.
Dummy's guide to Virtual Reality - Top 5 basic things you should know about VR, by Victoria Robertson
The Top 5 things every beginner should know about virtual reality.
VR is going mainstream and you should know the very basics if you don't want to seem like a dummy.
Falcon.io - The Future of Brand Experiences is Virtual Reality, by Falcon.io
A deep dive into the disruptive force of virtual reality. How will VR change the way we do business in the future? How can brands leverage the technology to own and personalize brand experiences? Here are the key takeaways from Falcon's Director of Product Innovation, Mikael Lemberg, as presented during Social Media Week Copenhagen 2017.
Collection of news, press releases, case studies, contributors, devices, diagrams, market statistics, and interpretations for virtual reality (VR) news as of Q1 2016
Location Based VR: VR Internet Cafe & VR Arcade | Jikhan Jung, posted by Jessica Tams
Delivered at Casual Connect USA 2016. VR has been increasing in popularity, especially HTC Vive which is proving to be a good VR experience with hand controllers and room scale tracking. However, cost and space requirements make it difficult for the normal household to have this. In a similar way that Internet Cafes became popular in Asia in early PC days, VR Internet Cafes and VR Arcades will play an important role in the popularity of VR in these early days.
This PowerPoint presentation is about future technology and the effect of virtual reality on today's world. It shows what our future may look like.
With the advance of virtual reality technologies like head-mounted displays (HMDs), creatives and UX/UI designers today face one of the most exciting moments one could ask for: the challenge of a new medium and the opportunity to create a range of symbology that will help design immersive and engaging experiences.
Designing UI and UX for Interactive Virtual Reality Apps, by rapidBizApps
Learn how compelling virtual reality experiences can be built with better ergonomics that reduce vertigo, engaging interactions, and techniques for eliminating simulator sickness.
--
With the advancement in hardware and software available in the market for virtual reality, designing interactive user experiences optimized for various devices is a challenge. This eBook introduces you to various hardware and accessories available today for experiencing virtual reality. The book also explains the nuances of building user experiences for each with a case in point.
In The Pocket Academy: VR // The Past, Present & Future of VR, by In The Pocket
Inspirational presentation, given at ITP Academy VR on May 10 2016, by Kenny Deriemaeker (@kderiemaeker) & Frederik De Bosschere (@vrederik).
Outlining the history of VR (all the way from the Renaissance to the 90's), its revival (Carmack & Luckey, the smartphone war dividend) and the different approaches by all of the main players today. Next, we highlight a few key industries where VR/AR is already happening. Finally, we make a few predictions for the future.
8 Technologies and Trends to Inspire Creativity and Enhance Teaching, by Extreme Networks
In today’s economy, it is critical for students to learn to be as creative and innovative as possible. Robots and artificial intelligence are rapidly taking over the routine, mechanical, and clerical jobs of the industrial age. New, emerging technologies are promoting creativity and enhancing teaching & learning in the classroom.
Eight technologies in particular have tremendous potential in education. They engage students and spark creativity. Each is based on the sound educational philosophies of Vygotsky and Bruner, but with a timely 21st-century approach. This SlideShare discusses each of the eight technologies and includes video clips of the technology in use.
https://webapprestaurant.com/
Virtual reality is essential in the gaming industry; putting on one of these headsets delivers a mix of cool and entertaining experiences.
Yode is a virtual, augmented and mixed reality production agency that creates revolutionary VR + AR solutions for businesses across multiple industries.
We provide full cycle of VR development process: whether it's design, 3D modeling, development or testing Virtual or Augmented Reality apps for any industries including advertising, education, exhibitions, healthcare and games.
ABI Research White Paper: Augmented and Virtual Reality: The First Wave of 5G..., by Qualcomm Research
White paper written by ABI Research and sponsored by Qualcomm to explore the specific requirements 5G has to deliver in order to support the next generation of VR and AR experiences.
MulteFire is a new LTE-based air-interface that is being developed to operate solely in unlicensed spectrum, enabling it to offer the best of both worlds: LTE-like performance with Wi-Fi-like deployment simplicity.
MulteFire will broaden the LTE ecosystem with new deployment scenarios, such as enhanced broadband services and neutral hosts, benefiting operators seeking to augment wireless services. MulteFire applies to any unlicensed or shared spectrum where over-the-air contention is needed (listen before talk), such as the global 5 GHz band or the new 3.5 GHz band in the USA. The combination of neutral spectrum with high-performing LTE and self-organizing networks will enable neutral-host small cells in more locations.
Introduction to LTE Advanced Pro. LTE Advanced Pro is a rich roadmap of technologies that will be introduced as part of the global 3GPP standard starting with Release 13 and beyond.
Progress on LAA and its relationship to LTE-U and MulteFire, by Qualcomm Research
Licensed Assisted Access (LAA) is introduced in 3GPP release 13 as part of LTE Advanced Pro. It uses carrier aggregation in the downlink to combine LTE in unlicensed spectrum (5 GHz) with LTE in the licensed band.
Shared/unlicensed spectrum is important for 5G and is valuable for a wide range of deployments, from extreme bandwidth by aggregating spectrum and enhanced local broadband to Internet of Things verticals. 5G New Radio (NR) will natively support all different spectrum types and is designed to take advantage of new sharing paradigms. We are pioneering 5G shared spectrum today by building on LTE-U/LAA, LWA, CBRS/LSA and MulteFire.
Full immersion is achieved by simultaneously focusing on the broader dimensions of visual quality, sound quality, and intuitive interactions. This presentation discusses how:
- Technology improvements continue to drive more immersive experiences, especially for VR and AR
- High Dynamic Range (HDR) will enhance the visual quality on all our screens
- Scene-based audio is a new paradigm for 3D audio
- Natural user interfaces like voice, gestures, and eye tracking are making interactions more intuitive
SRG WhitePaper: The prospect of LTE and Wi-Fi sharing unlicensed spectrumQualcomm Research
White Paper by Signals Research: The prospect of LTE and Wi-Fi sharing unlicensed spectrum. Learn more at www.qualcomm.com/invention/technologies/lte/unlicensed
A comprehensive analysis of the applications, use cases, and business considerations of LTE Broadcast from network operators, industry analysts and enterprise users perspective. To download, please visit: https://www.qualcomm.com/media/documents/files/lte-broadcast-white-paper-by-idc.pdf
Presentation describing how full immersion is achieved by simultaneously focusing on the broader dimensions of visual quality, sound quality, and intuitive interactions. The optimal way to enhance these broader dimensions requires an end-to-end approach, heterogeneous computing, and utilizing cognitive technologies.
Check out the immersive experiences website for the latest information: https://www.qualcomm.com/invention/cognitive-technologies/immersive-experiences
Download the presentation at: https://www.qualcomm.com/documents/immersive-experiences-presentation
Virtual reality (VR) is a simulated experience that employs pose tracking and 3D near-eye displays to give the user an immersive feel of a virtual world. Applications of virtual reality include entertainment (particularly video games), education (such as medical or military training) and business (such as virtual meetings). Other distinct types of VR-style technology include augmented reality and mixed reality, sometimes referred to as extended reality or XR, although definitions are currently changing due to the nascence of the industry.

Currently, standard virtual reality systems use either virtual reality headsets or multi-projected environments to generate realistic images, sounds and other sensations that simulate a user's physical presence in a virtual environment. A person using virtual reality equipment is able to look around the artificial world, move around in it, and interact with virtual features or items. The effect is commonly created by VR headsets consisting of a head-mounted display with a small screen in front of the eyes, but can also be created through specially designed rooms with multiple large screens. Virtual reality typically incorporates auditory and video feedback, but may also allow other types of sensory and force feedback through haptic technology.

"Virtual" has had the meaning of "being something in essence or effect, though not actually or in fact" since the mid-1400s.[2] The term "virtual" has been used in the computer sense of "not physically existing but made to appear by software" since 1959.[2]
In 1938, French avant-garde playwright Antonin Artaud described the illusory nature of characters and objects in the theatre as "la réalité virtuelle" in a collection of essays, Le Théâtre et son double. The English translation of this book, published in 1958 as The Theater and its Double,[3] is the earliest published use of the term "virtual reality". The term "artificial reality", coined by Myron Krueger, has been in use since the 1970s. The term "virtual reality" was first used in a science fiction context in The Judas Mandala, a 1982 novel by Damien Broderick.
Widespread adoption of the term "virtual reality" in the popular media is attributed to Jaron Lanier, who in the late 1980s designed some of the first business-grade virtual reality hardware under his firm VPL Research, and the 1992 film Lawnmower Man, which features use of virtual reality systems.[4] One method by which virtual reality can be realized is simulation-based virtual reality. Driving simulators, for example, give the driver on board the impression of actually driving an actual vehicle by predicting vehicular motion caused by driver input and feeding back corresponding visual, motion and audio cues to the driver.
With avatar image-based virtual reality, people can join the virtual environment in the form of real video as well as an avatar. One can participate in the 3D distributed virtual environment as either real video or a conventional avatar.
The Future of Play in a Gamified World: Webinar Presentation with transcript, by Gavin Gordon
As the digital experience of play and games becomes increasingly sophisticated and immersive, how does this alter consumers' perception and expectation of media?
Leading global consumer trend company Foresight Factory broadcast this webinar on Aug 25, 2016, looking at:
1. How will immersion in digital experiences - in commercial contexts too - change the consumer's relationship with media content?
2. What are the new and future expectations of brands across sectors such as retail, education, and health, and of those in the games space?
3. Consumers want easier, more efficient ways to better themselves. Can brands position themselves as life-hacking partners - helping the consumer play their way to a better self?
Foresight Factory research points to a huge demand for learning within the context of entertainment; does this follow with a world of gamification?
Virtual reality is a newly introduced technology that replaces the real world with a synthetic one. It makes people believe that they are in another reality.
"Augmented Reality, or Computer-Mediated Reality, is nothing but the extension of existing reality in real time with the help of computer software or programs, which helps the user better interact with it."
Augmented reality is set to dominate the next age of retail through product visualization. AR is going to take a little while, because there are some really hard technology challenges there.
But it will happen. It will happen in a big way. And we will wonder, when it does [happen], how we lived without it. Kind of how we wonder how we lived without our [smartphones] today.
I recently had the pleasure of experiencing VRTrek, a virtual reality (VR) system created by the talented team at syedhaseeb261. As an avid enthusiast of virtual reality, I was eager to explore the capabilities of this new offering. After spending several hours diving into various virtual worlds and experiences, I can confidently say that VRTrek provides an exceptional and immersive journey.
First and foremost, the hardware itself is impressive. The headset is comfortable to wear, with a sleek design that doesn't compromise on functionality. The visuals are stunning, delivering crisp and vibrant graphics that bring virtual environments to life. The high-resolution display ensures that every detail is rendered with precision, enhancing the overall sense of realism.
In this slide, we have discussed both virtual reality and augmented reality. We have also discussed their definition, evolution through time, types, how they work, their contribution in different sectors like medicine, treatment, and education, their application, limitations, and their future.
Virtual Reality (VR) Continuum, by AMP New Ventures
If the Internet is the sharing of information, then Virtual Reality (VR) is the sharing of experiences; and if most customer experiences are digital, then Virtual Reality (VR) must be important, for it is the next frontier in digital.
VR immerses users in indistinguishably real simulated environments, while Augmented Reality (AR) blends the digital into our physical environments. In the past month, PlayStation VR was released along with Google VR, to join a global ecosystem of VR content, infrastructure and platforms startups, projected to be worth $160bn by 2020.
Given that it will transform experiences across industries, including financial services, and that the expert consensus is that mainstream adoption is ~5 years away, we recommend financial services companies start exploring VR/AR possibilities now.
VR (Virtual Reality) is a technology that has transformed the way we interact with digital environments. In a nutshell, virtual reality provides a simulated experience that can be similar to or completely different from reality. It allows users to enter a computer-generated 3D environment and feel as if they are in a different world through various sensory stimuli. Since its inception, virtual reality has come a long way and is increasingly finding applications in fields such as gaming, education, healthcare, and even therapy. This article delves into the world of virtual reality, including its history, current applications, and potential future impact.
I. A Glimpse into the History of Virtual Reality
The Evolution of VR: From Humble Beginnings to Global Phenomenon
Though virtual reality may appear to be a recent innovation, it has a long history dating back to the mid-20th century. It all started with Morton Heilig's Sensorama, a machine he created in the 1950s. Sensorama aimed to provide the user with a multisensory cinematic experience. This early attempt laid the groundwork for what we now call virtual reality.
The term "virtual reality" was coined in the 1980s by Jaron Lanier, who founded VPL Research. Lanier and his team created the first VR goggles and data gloves, which allowed users to immerse themselves in and interact with virtual worlds. Since then, VR technology has advanced significantly, with significant contributions from companies such as Oculus and HTC.
II. The Mechanics of Virtual Reality
How VR Works: Creating the Illusion of Reality
At its core, VR is based on the combination of several technologies to create the convincing illusion of being in a different location or environment. This is made possible by four major components:
1. Head-Mounted Display (HMD): The most recognizable component of virtual reality is the VR headset, also known as an HMD. It has a screen for each eye that displays the virtual 3D environment. The HMD is worn on the user's head, allowing them to look around and feel as if they are inside the virtual world.
2. Tracking Sensors: These sensors are in charge of tracking the user's movements. They monitor the position and orientation of the head, as well as the position of any handheld controllers. This information is critical for real-time visual updates, ensuring that the virtual environment responds to the user's actions.
3. Audio Systems: Immersive audio is critical to making VR believable. Sound directionality is replicated by 3D audio systems, making it appear as if sounds are coming from specific locations within the virtual environment. This improves the overall feeling of presence.
4. Handheld Controllers: Handheld controllers are input devices that enable users to interact with the virtual environment. In a game, for example, they could act as virtual hands or tools, allowing users to manipulate objects and interact with the VR world.
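The four components above fit together in a simple per-frame loop: read the tracked head pose, derive one view matrix per eye, and render. Below is a minimal sketch in Python/NumPy of the per-eye view-matrix math; the `look_matrix` and `eye_views` helpers and the 64 mm interpupillary distance are illustrative assumptions, not any vendor's API.

```python
import numpy as np

def look_matrix(position, forward, up=np.array([0.0, 1.0, 0.0])):
    """Build a right-handed view matrix from a tracked pose (illustrative helper)."""
    f = forward / np.linalg.norm(forward)
    r = np.cross(f, up)
    r /= np.linalg.norm(r)
    u = np.cross(r, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f  # rotation rows
    view[:3, 3] = -view[:3, :3] @ position            # translation
    return view

def eye_views(head_pos, head_forward, ipd=0.064):
    """Offset the head pose by half the interpupillary distance for each eye."""
    right = np.cross(head_forward, np.array([0.0, 1.0, 0.0]))
    right /= np.linalg.norm(right)
    offset = right * (ipd / 2.0)
    return (look_matrix(head_pos - offset, head_forward),   # left eye
            look_matrix(head_pos + offset, head_forward))   # right eye
```

Each frame, the tracking sensors supply `head_pos` and `head_forward`, and the two returned matrices feed the HMD's left- and right-eye render passes.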
Generative AI models, such as ChatGPT and Stable Diffusion, can create new and original content like text, images, video, audio, or other data from simple prompts, as well as handle complex dialogs and reason about problems with or without images. These models are disrupting traditional technologies, from search and content creation to automation and problem solving, and are fundamentally shaping the future user interface to computing devices. Generative AI can apply broadly across industries, providing significant enhancements for utility, productivity, and entertainment. As generative AI adoption grows at record-setting speeds and computing demands increase, on-device and hybrid processing are more important than ever. Just like traditional computing evolved from mainframes to today’s mix of cloud and edge devices, AI processing will be distributed between them for AI to scale and reach its full potential.
In this presentation you’ll learn about:
- Why on-device AI is key
- Full-stack AI optimizations to make on-device AI possible and efficient
- Advanced techniques like quantization, distillation, and speculative decoding
- How generative AI models can be run on device and examples of some running now
- Qualcomm Technologies’ role in scaling on-device generative AI
As generative AI adoption grows at record-setting speeds and computing demands increase, hybrid processing is more important than ever. But just like traditional computing evolved from mainframes and thin clients to today’s mix of cloud and edge devices, AI processing must be distributed between the cloud and devices for AI to scale and reach its full potential. In this talk you’ll learn:
• Why on-device AI is key
• Which generative AI models can run on device
• Why the future of AI is hybrid
• Qualcomm Technologies’ role in making hybrid AI a reality
Qualcomm Webinar: Solving Unsolvable Combinatorial Problems with AI, by Qualcomm Research
How do you find the best solution when faced with many choices? Combinatorial optimization is a field of mathematics that seeks to find the most optimal solutions for complex problems involving multiple variables. There are numerous business verticals that can benefit from combinatorial optimization, whether transport, supply chain, or the mobile industry.
More recently, we’ve seen gains from AI for combinatorial optimization, leading to scalability of the method, as well as significant reductions in cost. This method replaces the manual tuning of traditional heuristic approaches with an AI agent that provides a fast metric estimation.
In this presentation you will find out:
- Why AI is crucial in combinatorial optimization
- How it can be applied to two use cases: improving chip design and hardware-specific compilers
- The state-of-the-art results achieved by Qualcomm AI Research
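As a toy illustration of combinatorial optimization (not Qualcomm's method), the sketch below builds a travelling-salesman tour with a classic nearest-neighbor heuristic; the pluggable `cost` argument marks the spot where a fast learned metric estimator, as described above, could replace the exact distance.

```python
import math

def tour_length(points, order):
    """Exact tour cost: sum of consecutive distances, returning to the start."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def greedy_tour(points, cost=math.dist):
    """Nearest-neighbor construction. `cost` is a hypothetical hook where a
    learned estimator could stand in for the exact metric."""
    unvisited = set(range(1, len(points)))
    order = [0]
    while unvisited:
        nxt = min(unvisited, key=lambda j: cost(points[order[-1]], points[j]))
        unvisited.remove(nxt)
        order.append(nxt)
    return order
```

The heuristic visits every city once; swapping in a cheaper approximate `cost` trades a little tour quality for much faster evaluation, which is the core of the AI-guided approach.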
- There is a rich roadmap of 5G technologies coming in the second half of the 5G decade with the 5G Advanced evolution
- 6G will be the future innovation platform for 2030 and beyond building on the 5G Advanced foundation
- 6G will be more than just a new radio design, expanding the role of AI, sensing, and other technologies in the connected intelligent edge
- Qualcomm is leading cutting-edge wireless research across six key technology vectors on the path to 6G
3D perception is crucial for understanding the real world. It offers many benefits and new capabilities over 2D across diverse applications, from XR and autonomous driving to IoT, camera, and mobile. 3D perception with machine learning is creating the new state of the art (SOTA) in areas such as depth estimation, object detection, and neural scene representation. Making these SOTA neural networks feasible for real-world deployment on mobile devices constrained by power, thermal, and performance has been a challenge. Qualcomm AI Research has developed not only novel AI techniques for 3D perception but also full-stack AI optimizations to enable real-world deployments and energy-efficient solutions. This presentation explores the latest research that is enabling efficient 3D perception while maintaining neural network model accuracy. You’ll learn about:
- The advantages of 3D perception over 2D and the need for 3D perception across applications
- Advancements in 3D perception research by Qualcomm AI Research
- Our future 3D perception research directions
5G is going mainstream across the globe, and this is an exciting time to harness the low latency and high capacity of 5G to enable the metaverse. A distributed-compute architecture across device and cloud can enable rich extended reality (XR) user experiences. Virtual reality (VR) and mixed reality (MR) are ready for deployment in private networks, while augmented reality (AR) for wide area networks can be enabled in the near term with Wi-Fi powered AR glasses paired with a 5G-enabled phone. Device APIs enabling application adaptation are critical for a good user experience. 5G standards are evolving to support the deployment of AR glasses at large scale, setting the stage for the 6G era with the merging of the physical, digital, and virtual worlds. Techniques like perception-enhanced wireless offer significant potential to improve user experience. Qualcomm Technologies is enabling the XR industry with platforms, developer SDKs, and reference designs.
Check out this webinar to learn:
• How 5G and distributed-compute architectures enable the metaverse
• The latest results from our boundless XR 5G/6G testbed, including device APIs and perception-enhanced wireless
• 5G standards evolution for enhancing XR applications and the road to 6G
• How Qualcomm Technologies is enabling the industry with platforms, SDKs, and reference designs
AI model efficiency is crucial for making AI ubiquitous, leading to smarter devices and enhanced lives. Besides the performance benefit, quantized neural networks also increase power efficiency for two reasons: reduced memory access costs and increased compute efficiency.
The quantization work done by the Qualcomm AI Research team is crucial in implementing machine learning algorithms on low-power edge devices. In network quantization, we focus both on pushing the state of the art (SOTA) in compression and on making quantized inference as easy to access as possible. For example, our SOTA work on oscillations in quantization-aware training pushes the boundaries of what is possible with INT4 quantization. Furthermore, for ease of deployment, integer formats such as INT16 and INT8 give performance comparable to floating-point formats, i.e., FP16 and FP8, but with significantly better performance per watt. Researchers and developers can make use of this quantization research to successfully optimize and deploy their models across devices with open-sourced tools like the AI Model Efficiency Toolkit (AIMET).
Presenters: Tijmen Blankevoort and Chirag Patel
Bringing AI research to wireless communication and sensing - Qualcomm Research
AI for wireless is already here, with applications in areas such as mobility management, sensing and localization, smart signaling, and interference management. Recently, Qualcomm Technologies has prototyped the AI-enabled air interface and launched the Qualcomm 5G AI Suite. These developments are possible thanks to expertise in both wireless and machine learning from over a decade of foundational research in these complementary fields.
Our approach brings together the modeling flexibility and computational efficiency of machine learning and the out-of-domain generalization and interpretability of wireless domain expertise.
In this webinar, Qualcomm AI Research presents an overview of state-of-the-art research at the intersection of the two fields and offers a glimpse into the future of the wireless industry.
Qualcomm AI Research is an initiative of Qualcomm Technologies, Inc.
Speakers:
Arash Behboodi, Machine Learning Research Scientist (Senior Staff Engineer/Manager), Qualcomm AI Research Daniel Dijkman, Machine Learning Research Scientist (Principal Engineer), Qualcomm AI Research
How will sidelink bring a new level of 5G versatility? - Qualcomm Research
Today, the 5G system mainly operates on a network-to-device communication model, exemplified by enhanced mobile broadband use cases where all data transmissions are between the network (i.e., base station) and devices (e.g., smartphone). However, to fully deliver on the original 5G vision of supporting diverse devices, services, and deployment scenarios, we need to expand the 5G topology further to reach new levels of performance and efficiency.
That is why sidelink communication was introduced in 3GPP standards, designed to facilitate direct communication between devices, independent of connectivity via the cellular infrastructure. Beyond automotive communication, it also benefits many other 5G use cases such as IoT, mobile broadband, and public safety.
5G is designed to serve an unprecedented range of capabilities with a single global standard. With enhanced mobile broadband (eMBB), massive IoT (mIoT), and mission-critical IoT, the three pillars of 5G represent extremes in performance and associated complexity. For IoT services, NB-IoT and eMTC devices prioritize low power consumption and the lowest complexity for wide-area deployments (LPWA), while enhanced ultra-reliable, low-latency communication (eURLLC), along with time-sensitive networking (TSN), delivers the most stringent use case requirements. But there exists an opportunity to more efficiently address a broad range of mid-tier applications with capabilities ranging between these extremes.
In 5G NR Release 17, 3GPP introduced a new tier of reduced capability (RedCap) devices, also known as NR-Light. It is a new device platform that bridges the capability and complexity gap between the extremes in 5G today with an optimized design for mid-tier use cases. With the recent standards completion, NR-Light is set to efficiently expand the 5G universe to connect new frontiers.
Download this presentation to learn:
• What NR-Light is and why it can herald the next wave of 5G expansion
• How NR-Light is accelerating the growth of the connected intelligent edge
• Why NR-Light is a suitable 5G migration path for mid-tier LTE devices
Realizing mission-critical industrial automation with 5G - Qualcomm Research
Manufacturers seeking better operational efficiencies, with reduced downtime and higher yield, are at the leading edge of the Industry 4.0 transformation. With mobile system components and reliable wireless connectivity between them, flexible manufacturing systems can be reconfigured quickly for new tasks, to troubleshoot issues, or in response to shifts in supply and demand.
There is a long history of R&D collaboration between Bosch Rexroth and Qualcomm Technologies for the effective application of these 5G capabilities to industrial automation use cases. At the Robert Bosch Elektronik GmbH factory in Salzgitter, Germany, this collaboration has reached new heights.
Download this deck to learn how:
• Qualcomm Technologies and Bosch Rexroth are collaborating to accelerate the Industry 4.0 transformation
• 5G technologies deliver key capabilities for mission-critical industrial automation
• Distributed control solutions can work effectively across 5G TSN networks
• A single 5G technology platform solves connectivity and positioning needs for flexible manufacturing
3GPP Release 17: Completing the first phase of 5G evolution - Qualcomm Research
This presentation summarizes the 5G NR Release 17 projects that were completed in March 2022. Release 17 further enhances the 5G foundation and expands 5G into new devices, use cases, and verticals.
AI firsts: Leading from research to proof-of-concept - Qualcomm Research
AI has made tremendous progress over the past decade, with many advancements building on fundamental research from decades ago. Accelerating the pipeline from research to commercialization is daunting because scaling technologies in the real world poses many challenges beyond the theoretical work done in the lab. Qualcomm AI Research has taken on the task of not only generating novel AI research but also being first to demonstrate proof-of-concepts on commercial devices, enabling technology to scale in the real world. This presentation covers:
• The challenges of deploying cutting-edge research on real-world mobile devices
• How Qualcomm AI Research is solving system and feasibility challenges with full-stack optimizations to quickly move from research to commercialization
• Examples where Qualcomm AI Research has had industrial or academic firsts
Setting off the 5G Advanced evolution with 3GPP Release 18 - Qualcomm Research
In December 2021, 3GPP reached consensus on the scope of 5G NR Release 18. This is a significant milestone marking the beginning of 5G Advanced, the second wave of wireless innovations that will fulfill the 5G vision. Release 18 builds on the solid foundation set by Releases 15, 16, and 17, and it sets the longer-term evolution direction of 5G and beyond. The release encompasses a wide range of new and enhancement projects, ranging from improved MIMO and an AI/ML-enabled air interface to extended reality optimizations and broader IoT support.
Cellular networks have provided positioning alongside voice and data communications from the beginning, starting with 2G, and we’ve since grown to rely on positioning technology to make our lives safer, simpler, more productive, and even fun. Cellular positioning complements other technologies to operate indoors and outdoors, including dense urban environments where tall buildings interfere with satellite positioning. It works whether we’re standing still, walking, or in a moving vehicle. With 5G, cellular positioning breaks new ground, bringing robust, precise positioning indoors and outdoors to meet even the most demanding Industry 4.0 needs.
As we look to the future, the Connected Intelligent Edge will bring a new dimension of positional insight to a broad range of devices, improving wireless use cases still under development. We’re already charting the course to 5G Advanced and beyond by working on the evolution of cellular positioning technology to include RF sensing for situational awareness.
Download the deck to learn more.
The need for intelligent, personalized experiences powered by AI is ever-growing. Our devices are producing more and more data that could help improve our AI experiences. How do we learn and efficiently process all this data from edge devices while maintaining privacy? On-device learning rather than cloud training can address these challenges. In this presentation, we’ll discuss:
- Why on-device learning is crucial for providing intelligent, personalized experiences without sacrificing privacy
- Our latest research in on-device learning, including few-shot learning, continuous learning, and federated learning
- How we are solving system and feasibility challenges to move from research to commercialization
This presentation outlines the synergistic nature of 5G and AI, two disruptive areas of innovation that can change the world. It illustrates the benefits of adopting AI for the advancement of 5G and showcases the latest progress made by Qualcomm Technologies, Inc.
Data compression has increased by leaps and bounds over the years due to technical innovation, enabling the proliferation of streamed digital multimedia and voice over IP. For example, a regular cadence of technical advancement in video codecs has led to massive reduction in file size – in fact, up to a 1000x reduction in file size when comparing a raw video file to a VVC encoded file. However, with the rise of machine learning techniques and diverse data types to compress, AI may be a compelling tool for next-generation compression, offering a variety of benefits over traditional techniques. In this presentation we discuss:
- Why the demand for improved data compression is growing
- Why AI is a compelling tool for compression in general
- Qualcomm AI Research’s latest AI voice and video codec research
- Our future AI codec research work and challenges
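The ~1000x figure quoted above can be sanity-checked with back-of-envelope arithmetic. The compressed bitrate below is an assumed, plausible value for a 4K VVC stream, not a measured benchmark:

```python
# Back-of-envelope check of the ~1000x raw-vs-compressed video figure
# (illustrative numbers, not a codec benchmark).
width, height, fps = 3840, 2160, 30        # 4K at 30 frames per second
bytes_per_pixel = 3                        # 8-bit RGB, no chroma subsampling
raw_bps = width * height * bytes_per_pixel * 8 * fps   # bits per second, uncompressed
compressed_bps = 6_000_000                 # ~6 Mbps, an assumed 4K VVC rate
ratio = raw_bps / compressed_bps
print(f"raw: {raw_bps / 1e9:.2f} Gbps, ratio: {ratio:.0f}x")
```

Under these assumptions the raw stream is about 6 Gbps, and the ratio lands right around three orders of magnitude.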
Smart TV Buyer Insights Survey 2024 - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Connector Corner: Automate dynamic content and events by pushing a button - DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
GraphRAG is All You Need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
UiPath Test Automation using UiPath Test Suite series, part 3 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
• UI automation introduction
• UI automation sample
• Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
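Under the hood, metrics streamed from JMeter's Backend Listener reach InfluxDB as line-protocol writes. Here is a hedged sketch of that wire format; the measurement, tag, and field names are illustrative rather than JMeter's exact defaults:

```python
import time

def influx_line(measurement, tags, fields, ts_ns=None):
    """Format one data point in InfluxDB line protocol:
    measurement,tag=value field=value timestamp"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    ts = ts_ns if ts_ns is not None else time.time_ns()
    return f"{measurement},{tag_str} {field_str} {ts}"

# One aggregated load-test sampler result (names are illustrative)
line = influx_line("jmeter",
                   {"application": "shop", "transaction": "login"},
                   {"avg": 182.5, "count": 240},
                   ts_ns=1700000000000000000)
# → jmeter,application=shop,transaction=login avg=182.5,count=240 1700000000000000000
```

Grafana dashboards then query these series (e.g., mean response time grouped by transaction) to render the real-time charts shown in the webinar.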
Generating a custom Ruby SDK for your web service or Rails API using Smithy - g2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Key Trends Shaping the Future of Infrastructure - Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality - Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
The Art of the Pitch: WordPress Relationships and Sales - Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... - UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Essentials of Automations: Optimizing FME Workflows with Parameters - Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Contents
1 Executive summary 4
2 VR offers unprecedented experiences 4
2.1 A new paradigm for how we play, learn, and communicate 5
3 The time is right for VR 7
3.1 Ecosystem drivers and technology advancements 7
3.2 The mobile industry is accelerating VR adoption 8
4 Truly immersive VR has extreme requirements 8
4.1 Visual quality 9
4.2 Sound quality 11
4.3 Intuitive interactions 11
5 QTI is uniquely positioned for VR 13
5.1 Snapdragon VR enhancements 15
6 Conclusion 16
1 Executive summary

Immersion enhances everyday experiences, making them more realistic, engaging, and satisfying. Virtual reality (VR) will provide the ultimate level of immersion, creating a sense of physical presence in real or imagined worlds. VR will bring a new paradigm for how we can interact with the world, offering unprecedented experiences and unlimited possibilities that will enhance our lives in many ways.

This promise of VR has excited us for decades, but it is happening now due to the alignment of ecosystem drivers and technology advancements. In fact, it is mobile technologies that are accelerating VR adoption. To stimulate our human senses with realistic feedback, truly immersive VR places extreme requirements on several dimensions of the three pillars of immersion¹: visual quality, sound quality, and intuitive interactions. Adding to the complexity, mobile VR requires full immersion at low power and thermals so that the headset is sleek, lightweight, and stylish while remaining cool.

Qualcomm Technologies, Inc. (QTI) is uniquely positioned to support superior mobile VR experiences by designing for these extreme requirements. By taking a system approach and custom designing specialized engines across the SoC, Qualcomm® Snapdragon™ processors are engineered to provide an efficient heterogeneous computing solution that is optimized from end-to-end for VR.

2 VR offers unprecedented experiences

Immersive experiences stimulate our senses—they draw us in, take us to another place, and keep us present in the moment. When done right, VR provides the ultimate level of immersion. By definition, VR creates a sense of physical presence in real or imagined worlds. For creating truly immersive experiences, VR places high emphasis on the three pillars of immersion to stimulate our senses with realistic feedback, providing:
• Visuals so vibrant that they are eventually indistinguishable from the real world
• Sounds so accurate that they are true to life
• Interactions so intuitive that they become second nature and you forget there is even an interface

Figure 1: Immersive VR stimulates our senses with realistic feedback (visuals, sounds, interactions)

¹ https://www.qualcomm.com/documents/making-immersive-virtual-reality-possible-mobile
2.1 A new paradigm for how we play, learn, and communicate

By creating the sense of presence, VR will bring a new paradigm for how we interact with the world. VR will offer unprecedented experiences and unlimited possibilities that allow us to live out our dreams on-demand. We’ll be able to be anywhere and do anything, with anyone. Our imagination and creativity are the only limits, but they will be stretched. It’s truly an exciting time. VR will impact every aspect of our lives, redefining how we play, learn, and communicate.

Figure 2: VR offers unprecedented experiences and brings a new paradigm for how we interact with the world (Play: immersive movies and shows; live concerts, sports, and other events; interactive gaming and entertainment. Learn: immersive education; training and demos; 3D design and art. Communicate: social interactions; shared personal moments; empathetic storytelling)
How we play
Immersive movies, shows, and videos: Movies and shows will be brought to a whole new level with things like 3D 360° spherical video capture and playback. The viewer will become the director and be able to look wherever they want, whether snowboarding, sky diving, or walking on the moon.
Live concerts, sports, and other events: Imagine watching sports from the best seat in the stadium, whether it is basketball, football, the World Cup, or the Olympics (Figure 3). Or even imagine having the point of view of a player, such as an NBA player on a breakaway dunk.

Interactive gaming and entertainment: Video games already create computer-generated, 360° virtual environments that are amazing, but VR will take gaming to the next level by making you feel like you are actually a character inside the game.

Figure 3: Sports fans enjoy a soccer game from the best seat
How we learn and work
Immersive education: School education and classroom learning will be revolutionized when students are taught with the best materials and lectures, which will keep them engaged. Students can take a field trip to Paris, the Great Wall of China, or the Grand Canyon without ever leaving the classroom.

Training and demos: Training in many areas, such as healthcare, the military, manufacturing, and even space exploration, will be more effective, more cost efficient, and safer with VR. For example, doctors can train in everything from surgical procedures to diagnosing a patient (Figure 4).

3D design and art: Design and visualization will be much more intuitive in VR. For example, car companies can build virtual prototypes of new vehicles, testing them thoroughly before producing a single physical part.

Figure 4: Doctors learn through realistic training
How we communicate and interact
Social interactions: Individuals can virtually meet, interact, and talk with one another while feeling as if they are physically located in the same place. Family get-togethers and work meetings from distant locations will never be the same.

Shared personal moments: The people you care about can virtually feel part of the moment, whether it is your child’s first step, an amazing vacation, or a wedding ceremony (Figure 5).

Empathetic storytelling and immersive visualization: When you become part of the story and part of the scene, the storytelling can be much more impactful, immersive, and empathetic. For example, you can know what it is like to live in a third-world country or to have a disability.

VR is about creating these immersive experiences. Furthermore, it is important to point out that virtual reality is not augmented reality. Although they share similar underlying technologies, they offer distinct experiences. VR simulates physical presence in real or imagined worlds and enables the user to interact in that world, while AR superimposes content over the real world such that the content appears to the viewer to be part of the real-world scene. Advancements in VR technologies will help make AR possible, and vice versa.

Figure 5: Grandparents feel like they are at their grandchild’s recital
3 The time is right for VR
The promise of VR has excited us for decades, but introducing a new product category to the world is always challenging.
However, the time is right for VR and it is happening now. Ecosystem drivers and technology advancements are aligning to make
VR possible.
Figure 6: Ecosystem drivers (device availability, software infrastructure, content creation and deployment) and technology advancements (multimedia technologies, display and sensor technologies, power and thermal efficiency) are aligning for VR
3.1 Ecosystem drivers and technology advancements
The key ecosystem pieces—such as device availability, software infrastructure, and content creation and availability—are in place
due to industry collaboration.
Device availability: Besides the upcoming high-profile VR product launches from multiple OEMs in 2016, smartphone-powered VR headsets are already available and taking off. Google Cardboard, which launched in 2014, has been adopted by consumers and enterprises. In fact, over 5 million Google Cardboard viewers have shipped². Mobile VR headsets, as opposed to tethered ones, will drive mass adoption and provide the freedom to enjoy VR anywhere.
Software infrastructure: The entire VR software stack is being optimized to remove bottlenecks so that VR runs well. The software
infrastructure includes many components, such as the drivers, operating system, middleware engines, and tools & SDKs. QTI, for
example, is helping to optimize application programming interfaces (APIs) across the entire software stack.
² https://googleblog.blogspot.com/2016/01/unfolding-virtual-journey-cardboard.html
Content creation and availability: With hardware and software available, VR content is also being created and deployed. Content developers are experimenting with VR to create new and unprecedented experiences. App stores, such as the Google Play Store and Oculus Store, and video stores, such as YouTube 360 and Facebook 360, are starting to fill up. For example, there are over 1000 Google Cardboard apps on the Google Play Store². A broad spectrum of VR video is already being distributed, including TV content such as Saturday Night Live’s 40th anniversary show, sports content such as the NBA season opener of the Warriors versus the Pelicans, and concerts such as Paul McCartney’s.
In terms of technology advancements, exponential improvements in many technologies are making VR possible, such as:
Multimedia technologies: Graphics, video, and audio processing have improved significantly. For example, graphics processing for
parallel tasks has increased exponentially for many years, enabling support for photorealistic graphics and visual quality.
Display & sensor technologies: Displays have significantly increased pixel density, power efficiency, and visual quality. Sensors, such
as gyroscopes, accelerometers, and tracking cameras, are smaller, higher precision, lower power, and lower cost.
Power & thermal efficiency: Architecture innovations, such as heterogeneous computing, have improved power and thermal
efficiency. Integration efficiency has improved due to better transistors and Moore’s Law. Optimized algorithms, such as motion
tracking, run more efficiently on the hardware.
3.2 The mobile industry is accelerating VR adoption
Many of the technology advancements mentioned above have been driven by smartphones, and the VR ecosystem development
will mirror what happened in the mobile industry. The mobile ecosystem has characteristics that make the proliferation of new
technologies very feasible, such as:
Innovation at scale, which brings both cutting edge technology and cost advantages. Over a billion smartphones are shipping
globally per year, which brings tremendous scale and innovation to mobile.
Rapid design cycles, which bring fast adoption of those cutting edge technologies. Smartphone OEMs have a cadence of shipping
an upgraded model every year.
Mass adoption, which means that smartphone usage has created a broad appeal for mainstream consumers. Smartphone users are
adventurous and willing to try new things, such as downloading new apps from an app store.
4 Truly immersive VR has extreme requirements
VR places extreme requirements on the three pillars of immersion: visual quality, sound quality, and intuitive interactions. Truly
immersive VR experiences can only be achieved by simultaneously focusing on the broader dimensions of these pillars. Adding to the complexity, mobile VR requires full immersion at low power and thermals so that the headset can be sleek, lightweight, and stylish while remaining cool.
Figure 7: Truly immersive VR has extreme requirements across the three pillars of immersion: visual quality (extreme pixel quantity and quality, spherical view, stereoscopic display), sound quality (high-resolution audio, 3D audio), and intuitive interactions (precise motion tracking, minimal latency, natural user interfaces)
4.1 Visual quality
To create visuals so realistic that they are eventually indistinguishable from the real world, VR requires extreme pixel quantity and
quality, spherical view, and stereoscopic display.
Extreme pixel quantity and quality
Extreme pixel quantity and quality are required on VR headsets for several reasons, most of which can be explained by the human visual system. For immersive VR, the entire field of view (FOV) needs to be the virtual world; otherwise, you will not believe that you are actually present there. The combination of the human eye having a wide FOV and the fovea³ having high visual acuity means that an extreme number of pixels is required. In a VR headset, the display is brought close to the eyes, and biconvex lenses magnify the screen further so that the virtual world fills the entire FOV. As the screen takes up more of the FOV, pixel density and quality must increase to maintain presence. Otherwise, you will see individual pixels—known as the screen-door effect—and no longer feel present in the virtual world.
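The scale of "extreme" can be estimated with two commonly cited approximations: foveal acuity of roughly 60 pixels per degree, and a per-eye FOV of about 110°. A quick calculation under those assumed figures:

```python
# Rough estimate of the pixels needed to match foveal acuity across a
# headset's field of view (both inputs are commonly cited approximations).
pixels_per_degree = 60          # ~1 arcminute acuity at the fovea
fov_h, fov_v = 110, 110         # per-eye field of view in degrees (assumed)
per_eye = (fov_h * pixels_per_degree) * (fov_v * pixels_per_degree)
both_eyes = 2 * per_eye
print(f"{both_eyes / 1e6:.0f} megapixels for both eyes")
```

The result is on the order of tens of megapixels per frame, far beyond today's display panels, which is exactly why techniques that reduce the rendered pixel count matter.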
One potential approach to help reduce the quantity of pixels processed is foveated rendering. Foveated rendering exploits the
falloff of acuity in the visual periphery by rendering high resolution where the eye is fixated and lower resolution everywhere else,
thus reducing the total pixels rendered.
3. The fovea is a small portion of the retina in the eye and is responsible for sharp vision.
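The savings that foveated rendering offers can be sketched numerically. This is a minimal illustration, not any vendor's implementation: the linear acuity-falloff model, the 5° foveal region, and the falloff coefficient are assumed round numbers.

```python
def shading_rate(eccentricity_deg, fovea_deg=5.0, slope=0.022):
    """Relative resolution needed at a given angular distance from the gaze
    point, using a simple linear acuity-falloff model: the minimum
    resolvable angle grows roughly linearly outside the fovea."""
    if eccentricity_deg <= fovea_deg:
        return 1.0  # full resolution where the eye is fixated
    # Acuity falls off approximately linearly beyond the fovea.
    return 1.0 / (1.0 + slope * (eccentricity_deg - fovea_deg))

def pixels_saved(fov_deg=100.0, step_deg=1.0):
    """Rough fraction of shading work saved across one dimension of the
    FOV, assuming the gaze is centered."""
    full = reduced = 0.0
    ecc = 0.0
    while ecc <= fov_deg / 2:
        full += 1.0
        reduced += shading_rate(ecc)
        ecc += step_deg
    return 1.0 - reduced / full

print(f"Approx. shading saved over a 100° FOV: {pixels_saved():.0%}")
```

Even this crude model shows a meaningful reduction in rendered pixels, which is why foveated rendering pairs naturally with eye tracking.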
Figure 8: Problem and solution for both lens distortion and chromatic aberration.
The lenses in the VR headset, which help provide the full FOV, also create visual quality challenges (Figure 8). A wide-angle
biconvex lens creates pincushion distortion, so a barrel distortion must be applied to the rendered image to compensate. In
addition, further visual processing is required to correct for chromatic aberration, which causes colors to be focused at different
positions in the focal plane.
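The compensating barrel warp is, at its core, a radial rescaling of image coordinates, and the chromatic correction is the same rescaling applied slightly differently per color channel. A minimal sketch, assuming a simple polynomial distortion model; the coefficients are illustrative, since real lenses are calibrated individually:

```python
def barrel_warp(x, y, k1=-0.22, k2=-0.02):
    """Pre-distort a normalized image coordinate (origin at the lens
    center, r = 1 at the image edge) with a radial polynomial. Negative
    coefficients pull points toward the center (barrel), canceling the
    lens's pincushion distortion."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def chromatic_warp(x, y, channel_scale):
    """Per-channel variant: red, green, and blue are warped with slightly
    different radial scales to cancel chromatic aberration."""
    xw, yw = barrel_warp(x, y)
    return xw * channel_scale, yw * channel_scale

# A point near the edge is pulled inward; the center is unmoved.
print(barrel_warp(0.9, 0.0))
print(barrel_warp(0.0, 0.0))  # → (0.0, 0.0)
```

In practice this warp runs per-pixel on the GPU as the final pass before scan-out, which is part of why image correction appears in the latency budget discussed later.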
Spherical view
While wearing a VR headset, the user is free to look anywhere in the virtual world. As a result, VR needs to provide a full 360° spherical view, which means generating even more pixels as compared to a fixed view of the world. Although a 360° spherical view is not new for gaming, it is new for video and is being driven by VR. Multiple 360° spherical video formats exist in the industry, such as equirectangular, cube-map, and pyramid-map, so it is important to support them all.
Besides the video decoder being able to handle a high bit-rate, the GPU must also warp each image of the video so that it maps
correctly to the display. Most premium 360° spherical video will be high resolution and content protected, so it is also important
that the GPU supports digital rights management (DRM) extensions.
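As a concrete example of the warping involved, the equirectangular format stores the sphere as a single latitude/longitude grid, so sampling it means converting a view direction into panorama coordinates. A minimal sketch, assuming a y-up, negative-z-forward camera convention and a 4K-class panorama:

```python
import math

def dir_to_equirect(dx, dy, dz, width, height):
    """Map a unit view direction (y up, -z forward) to pixel coordinates
    in an equirectangular panorama. Yaw spans the full width (360°),
    pitch spans the height (180°)."""
    yaw = math.atan2(dx, -dz)                    # -pi .. pi
    pitch = math.asin(max(-1.0, min(1.0, dy)))   # -pi/2 .. pi/2
    u = (yaw / (2 * math.pi) + 0.5) * width
    v = (0.5 - pitch / math.pi) * height
    return u, v

# Looking straight ahead lands in the middle of the panorama.
print(dir_to_equirect(0.0, 0.0, -1.0, 3840, 1920))  # → (1920.0, 960.0)
```

The GPU performs this mapping (or its cube-map/pyramid-map equivalent) for every displayed pixel of every frame, which is why format support is a hardware concern rather than a software detail.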
Stereoscopic display
Seeing the world in 3D is key to immersion. Humans see in 3D due to our binocular vision, which allows us to see objects in the scene at the right depth. To replicate this on a VR headset, a stereoscopic display shows slightly different images to each eye. These images are generated from different viewpoints, ideally separated by the actual distance between the human eyes, so that objects in the scene appear at the right depth. Since stereoscopic display requires generating and processing an image per eye, this is another reason for extreme pixel quantity. For VR, we need to generate the appropriate view for each eye with stereoscopic rendering and video.
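The depth cue itself comes from horizontal disparity between the two eye images. A back-of-the-envelope sketch using a pinhole camera model; the interpupillary distance and focal length here are illustrative assumptions, not values from any particular headset:

```python
def screen_disparity_px(depth_m, ipd_m=0.064, focal_px=1400.0):
    """Horizontal disparity (in pixels) between the left- and right-eye
    images for a point at the given depth, under a simple pinhole model:
    disparity = baseline * focal_length / depth."""
    return ipd_m * focal_px / depth_m

# Nearby objects separate strongly between the eyes; distant ones barely do.
for d in (0.5, 2.0, 10.0):
    print(f"{d:>5} m -> {screen_disparity_px(d):6.1f} px")
```

The inverse relationship with depth is why near-field scenes are the most demanding test of stereoscopic rendering: small errors in disparity are most visible exactly where disparity is largest.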
4.2 Sound quality
To create sound so accurate that it is true-to-life and completely in sync with the visuals, VR requires high resolution audio and 3D audio.
High resolution audio
The sampling frequency and bits-per-sample need to match human hearing capabilities to create high-fidelity sound that is truly immersive. Increased sampling rates capture both low frequency sounds, such as water dripping, and high frequency sounds, such as birds chirping, so that the entire audio environment can be reproduced. Increased precision, or bits-per-sample, improves audio fidelity: more bits allow the analog sound signal to be reproduced more precisely when converted to digital.
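Two quick formulas make these requirements concrete: the Nyquist criterion fixes the minimum sampling rate, and each extra bit of precision adds roughly 6 dB of dynamic range. The 20 kHz hearing limit is the commonly cited figure for human hearing:

```python
def nyquist_rate_hz(max_audible_hz=20000):
    """Minimum sampling rate to capture frequencies up to the limit of
    human hearing (Nyquist: sample at least twice the highest frequency)."""
    return 2 * max_audible_hz

def dynamic_range_db(bits):
    """Theoretical dynamic range of linear PCM: ~6.02 dB per bit + 1.76 dB."""
    return 6.02 * bits + 1.76

print(nyquist_rate_hz())  # 40000 — why 44.1/48 kHz sampling rates exist
print(f"16-bit: {dynamic_range_db(16):.1f} dB, 24-bit: {dynamic_range_db(24):.1f} dB")
```

This is why "high resolution audio" in practice means sampling rates at or above 48 kHz and depths of 24 bits or more.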
3D audio
Realistic 3D positional audio makes sound accurate to the real world and much more immersive. This goes beyond surround sound, since the sound adjusts dynamically based on your head position and the location of the sound source. For example, if a plane is flying by, the sound from the plane will keep adjusting as both you and the plane move. Positional audio is achieved by understanding how humans hear. A head related transfer function (HRTF) attempts to model how humans hear sound: it takes into account typical human facial and body characteristics, such as the location, shape, and size of the ears, and is a function of frequency and three spatial variables. Positional audio requires the HRTF and a 3D audio format, such as scene-based audio or object-based audio, so that the sound arrives properly at the ears.
Reverberation also makes sound more realistic. In real life, sound reflections spread and interact with the environment appropriately. Reverberation is a function of sound frequency, material absorption, room volume, and room surface area. Sophisticated models of the environment can be developed to create real-time convolutional reverberation.
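One ingredient an HRTF captures, the interaural time difference (ITD), can be approximated in a few lines. This is a deliberately simplified stand-in, using the classic Woodworth spherical-head formula with an assumed head radius; a full HRTF also models the frequency-dependent filtering by the ears and torso:

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Interaural time difference for a distant source at a given azimuth
    (0° = straight ahead, 90° = directly to one side), via the Woodworth
    spherical-head approximation: ITD = (a / c) * (sin(theta) + theta)."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (math.sin(theta) + theta)

def binaural_delays(azimuth_deg):
    """(left, right) arrival delays: the far ear hears the source later."""
    itd = itd_seconds(abs(azimuth_deg))
    return (itd, 0.0) if azimuth_deg > 0 else (0.0, itd)

# A source at 90° to the right arrives earlier at the right ear.
print(f"{itd_seconds(90) * 1000:.2f} ms")
```

Re-evaluating these delays every time the head pose updates is what makes the sound source appear anchored in the virtual world rather than to the headset.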
4.3 Intuitive interactions
To create interactions so intuitive and responsive that they become second nature and you forget that you are even dealing with
an interface, VR requires natural user interfaces, precise motion tracking, and minimal latency.
Natural user interface
For a VR headset, a primary input method is head movement. Turning your head to look around is very natural and intuitive. Other
input methods, such as gestures, a control pad, and voice, are also being investigated by the VR industry. Figuring out the right
input method is an open debate and a large research area.
The VR headset should be free from wires so that the user can move freely and not be tethered to a fixed-location power outlet
or computer. The headset also needs to be lightweight and comfortable for the user to wear it for long periods of time. In addition,
since the headset is a wearable device directly contacting the skin of your head, it must remain cool to the touch.
To meet the extreme VR processing requirements on the device within the thermal and power constraints of a sleek wearable device, very efficient processing is required. A heterogeneous computing approach4, which utilizes specialized engines across the SoC for efficient processing, provides high performance at low power and thermals. To enable a connected mobile experience, a high-bandwidth wireless connection is required. High bandwidth technologies like LTE Advanced, 802.11ac Wi-Fi, and 802.11ad Wi-Fi allow for fast downloads and smooth streaming.
Precise motion tracking
The user interface needs accurate on-device motion tracking so that the user can interact with and move freely in the virtual
world. For example, when turning your head to explore the virtual world, accurate head tracking will provide the pose to generate the proper visuals (and sounds). Similarly, if your head is stable and not moving, the visuals need to stay completely still; otherwise it may feel like you are on a boat. Motion is often characterized by how many degrees of freedom are possible in movement: either 3 degrees of freedom (3-DOF) or 6 degrees of freedom (6-DOF).
4. https://www.qualcomm.com/news/onq/2013/07/03/diversity-only-makes-you-stronger
3-DOF detects rotational movement around the X, Y, and Z axes: the orientation. For head movements, that means being able to yaw, pitch, and roll your head (Figure 9) while keeping the rest of your body in the same location. 3-DOF in VR allows you to look around the virtual world from fixed points; think of a camera on a tripod. For many 360° spherical videos, such as sporting events or nature footage, 3-DOF will provide very immersive content.
6-DOF detects rotational and translational movement: the orientation and position. This means that your body can now move through the virtual world in the X, Y, and Z directions rather than staying at fixed viewpoints. 6-DOF in VR is very beneficial for experiences like gaming, where you can move freely in the virtual world and look around corners. However, even simple things, like looking at objects on a desk or shifting your head side to side, can be compelling with 6-DOF.
One solution to provide precise on-device motion tracking is visual-inertial odometry (VIO). VIO fuses on-device camera and
inertial sensor data to generate an accurate 6-DOF pose. The on-device solution allows the VR headset to be completely mobile,
so you can enjoy room-scale VR and not worry about being tethered to a PC or getting tangled in wires.
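The flavor of this kind of fusion can be illustrated with a one-axis complementary filter, a heavily simplified stand-in for real VIO (which fuses camera features with inertial data across all six degrees of freedom); all constants below are assumed for illustration:

```python
def complementary_filter(gyro_rates, accel_angles, dt=0.001, alpha=0.98):
    """Fuse a drifting-but-smooth gyroscope with a noisy-but-absolute
    accelerometer to track one tilt angle. Each step trusts the
    integrated gyro short-term and the accelerometer long-term."""
    angle = accel_angles[0]
    for rate, accel_angle in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel_angle
    return angle

# Head held still at 10° with a biased gyro (0.5 deg/s of drift): the
# gyro alone would wander off, but the accelerometer term keeps the
# fused estimate pinned near 10°.
n = 5000
est = complementary_filter([0.5] * n, [10.0] * n)
print(f"fused estimate after {n * 0.001:.0f} s: {est:.2f}°")
```

The same divide-and-correct principle applies in VIO: the IMU provides smooth high-rate motion, while the camera provides the absolute reference that cancels drift.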
Minimal latency
Any lag in the user interface, whether it is the visual or audio, will be very apparent to the user and impact the ability to create
immersion—plus it may make the user feel sick. Reducing system latency is key to stabilizing the virtual world as the user moves.
One of the biggest challenges for VR is the amount of time between an input movement and the screen being updated, which is
known as “motion to photon” latency. The total motion to photon latency must be less than 20 milliseconds (ms) for a good user experience in VR. To put this challenge in perspective, a display running at 60 Hz is updated every 17 ms, and a display running at 90 Hz is updated every 11 ms.
Figure 9: 3-DOF and 6-DOF (yaw, pitch, and roll around the X, Y, and Z axes)
5. Hardware streaming refers to processing engines that stream data directly to each other, potentially bypassing DRAM and CPU interactions. For example, a camera that streams directly to the DSP saves memory bandwidth and CPU cycles.
6. Late latching refers to processing engines utilizing the latest head pose (rather than an older version) to generate more accurate visuals and sounds.
7. Described below in section 5.1.
There are many processing steps required before updating the display. The end-to-end path includes sampling the sensors, sensor
fusion, view generation, render / decode, image correction, and updating the display.
Figure 10: The motion to photon path includes many tasks: motion detection (sensor sampling, sensor fusion), visual processing (view generation, render / decode), and quality enhancement and display (image correction, display update, and finally the new pixels' light emitted from the screen)
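A simple budget check shows how tight 20 ms is once each stage in that path takes its share. The per-stage numbers below are illustrative assumptions, not measured figures from any device:

```python
# Illustrative per-stage timings (ms) along the motion to photon path.
MOTION_TO_PHOTON_BUDGET_MS = 20.0

stages_ms = {
    "sensor sampling": 1.0,
    "sensor fusion": 1.0,
    "view generation": 2.0,
    "render / decode": 11.0,   # roughly one 90 Hz frame
    "image correction": 2.0,
    "display scan-out": 2.0,
}

total = sum(stages_ms.values())
print(f"total: {total:.1f} ms "
      f"({'within' if total <= MOTION_TO_PHOTON_BUDGET_MS else 'over'} budget)")
```

Even with generous assumptions, a single frame of rendering consumes over half the budget, which is why the remaining stages must overlap, run in parallel, or be shortcut entirely.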
To minimize the system latency, an end-to-end approach that reduces the latency of individual processing tasks, that runs as
many tasks in parallel as possible, and that takes the appropriate shortcuts for processing tasks is needed. An optimized solution
requires hardware, software, sensors, and display all working together in harmony. Knowledge in all these areas is required to
make the appropriate optimizations and design choices. Possible optimizations to reduce latency include high sensor sampling speeds, high frame rates, hardware streaming5, late latching6, asynchronous time warp7, and single buffer rendering7.
5 QTI is uniquely positioned for VR
We are at just the beginning of the VR journey, and we expect the mobile VR headset to significantly evolve and improve over time.
The industrial design will become sleeker, lighter, and more fashionable as technologies advance (Figure 11). To make this possible, we foresee continued improvements coming in many areas, such as power and thermal efficiency, pixel density and pixel quality, sound quality, intuitive interactions, and cost efficiency.
Figure 11: Sleeker, lighter, and more fashionable VR devices are coming: from Google Cardboard and slot-in HMDs to sleek HMDs and, eventually, imperceptible devices, driven by continued improvements in power efficiency, cost efficiency, pixel density and quality, intuitive interactions, and sound quality
8. Snapdragon VR SDK is expected to be available to developers in the second quarter of 2016.
QTI is uniquely positioned to meet the extreme requirements of immersive VR as the headset evolves. We expect to make VR
experiences more immersive by designing efficient solutions that meet the device constraints and by helping the ecosystem bring
differentiated products to customers.
We see opportunities to improve VR experiences by focusing on the three pillars of immersion. For visual quality, we believe it is
important to provide consistent accurate color, high resolution & frame rates, and 3D 360° spherical view. For sound quality, we
believe it is important to provide realistic 3D positional audio, 3D surround sound, and noise removal. For intuitive interactions,
we believe it is important to provide minimized system latency and precise motion tracking, while delivering intelligent contextual
interactions.
Because of our heritage and technology leadership in smartphones, we are applying our mobile expertise to VR and are well
positioned in the areas of:
• Faster development cycles than other types of devices, with customers who increasingly require more complete
solutions.
• Sleek, passively cooled form factors that get thinner and more challenging each generation.
• Reducing cost of technologies each generation to lower price points, so that our customers can deploy new, more
immersive experiences to consumers worldwide.
We also enable the ecosystem to commercialize devices and experiences via Snapdragon solutions and via ecosystem enablement.
For our Snapdragon solutions, we’ve made the appropriate VR tradeoffs and focused on the right dimensions to design an efficient,
differentiated SoC through:
• Custom designed processing engines: Rather than licensing off-the-shelf processing engines, we have custom designed
several processing engines, such as the Qualcomm® Adreno™ GPU and Qualcomm® Hexagon™ DSP, to be optimized
for specific tasks and efficiency. In addition to designing superior processing engines, we also gain tremendous
knowledge that we use to make system level optimizations.
• Efficient heterogeneous computing: Our system approach allows us to design an optimal heterogeneous computing
solution, taking into account the applications and tasks that need to be accomplished so that we can make appropriate
hardware decisions. We then run the tasks on the most appropriate engines through optimized system-level software.
Snapdragon 820, for example, utilizes specialized engines across the SoC for efficient processing of the diverse VR tasks,
providing high performance at low power and thermals.
• Optimized end-to-end solutions: By thinking holistically at the system level, understanding the challenges, and working
with other companies in the ecosystem, we develop comprehensive VR solutions. For example, we are able to optimize
the entire system to reduce motion to photon latency. Removing the latency bottlenecks requires making optimizations
across the entire SoC and system software. With the knowledge and design expertise in house, we can quickly come up
with innovative solutions and introduce new capabilities.
For ecosystem enablement, QTI helps the ecosystem to quickly bring products to customers through app development tools,
device optimization tools, development platforms, and ecosystem collaboration. For VR specifically, we are introducing the
Snapdragon VR SDK8, which has new APIs optimized for VR. Developers will be able to use commercially available Snapdragon 820
VR devices to see how VR applications run on real hardware.
5.1 Snapdragon VR enhancements
Along with the Snapdragon 820 processor, which was designed with VR in mind, QTI announced the Snapdragon VR SDK. The Snapdragon VR SDK provides developers access to advanced VR features so that they can simplify development, optimize performance, and improve power efficiency while reducing development time.
Figure 12: Snapdragon VR SDK offers simplified development, optimized VR performance, and power and thermal efficiency through APIs optimized for VR: DSP sensor fusion, asynchronous time warp, stereoscopic rendering, lens distortion correction, chromatic aberration correction, single buffer rendering, UI layering, and power & thermal management
The Snapdragon VR SDK introduces new APIs optimized for VR, such as:
• DSP sensor fusion: Provides precise low-latency 6-DOF motion tracking, real-time and predicted, via fusion processing
on the Hexagon DSP for improved responsiveness.
• Asynchronous time warp: Warps the image based on the latest head pose just prior to scan out.
• Chromatic aberration correction: Corrects color distortion based on lens characteristics.
• Lens distortion correction: Barrel warps the image based on lens characteristics.
• Stereoscopic rendering: Generates left and right eye views, up to 3200x1800 at 90 fps.
• Single buffer rendering: Renders directly to the display buffer for immediate display scan out.
• UI layering: Generates menus, text, and other overlays such that they appear properly in a VR world (undistorted).
• Power and thermal management: Qualcomm® Symphony System Manager SDK provides CPU, GPU, and DSP power and
performance management to consistently hit 90 FPS.
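The idea behind asynchronous time warp can be reduced to a one-axis sketch: re-project the last rendered frame by the head rotation that occurred since it was rendered, just before scan-out. This is a conceptual illustration with an assumed focal length, not the Snapdragon VR SDK's API; real implementations warp the full image per-pixel on the GPU:

```python
import math

def time_warp_shift_px(last_render_yaw_deg, latest_yaw_deg, focal_px=1400.0):
    """Horizontal pixel shift that re-projects an already-rendered frame
    to the newest head yaw, reduced to a single rotation axis and a
    pinhole camera: shift = focal_length * tan(delta_yaw)."""
    delta = math.radians(latest_yaw_deg - last_render_yaw_deg)
    return focal_px * math.tan(delta)

# The head turned 0.5° after the frame was rendered; shifting the frame
# by roughly a dozen pixels makes the world appear to stay put.
print(f"{time_warp_shift_px(30.0, 30.5):.1f} px")
```

Because this correction is far cheaper than re-rendering the scene, it can run right before scan-out with the late-latched head pose, masking most of the render latency.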
Other new Snapdragon 820 features that benefit VR include:
• Several enhancements to achieve < 18 ms motion to photon latency
• High-quality 360° spherical video at 4K 60 fps, for both HEVC and VP9
• 3D positional audio for more immersive video and gaming experiences
• Ultra-fast connectivity through LTE Advanced, 802.11ac, and 802.11ad
By offering hardware, software, tools, and devices all optimized for VR, QTI is uniquely positioned to lead in VR. Snapdragon 820 is
the ideal solution for today’s mobile VR.
6 Conclusion
We are on the verge of consumer VR becoming a reality. After several false starts, ecosystem drivers and technology
advancements are aligning to make VR possible—and the mobile industry will lead the way. VR offers such new and compelling
experiences that it is going to transform the way we interact with the world. Making those VR experiences truly immersive requires
simultaneously meeting several extreme requirements across visual quality, sound quality, and intuitive interactions. Mobile
VR, which will drive mass consumer adoption, adds additional power and thermal requirements since the headset needs to be
comfortable, lightweight, and cool to the touch.
QTI is uniquely positioned to meet these extreme requirements by designing efficient solutions through custom designed
processing engines, efficient heterogeneous computing, and optimized end-to-end solutions. For example, the Snapdragon 820
was designed with VR in mind to make immersive VR possible on a mobile device. The Snapdragon VR SDK provides developers
access to advanced VR features on the Snapdragon 820. QTI will continue to drive the industry forward as the mobile VR headset
continues to evolve. Enabling truly immersive VR on mobile devices is yet another example of how Qualcomm Technologies is
once again re-inventing the mobile world we live in.
To get the most up-to-date information about virtual reality, please visit:
www.qualcomm.com/VR