5G is going mainstream across the globe, and this is an exciting time to harness the low latency and high capacity of 5G to enable the metaverse. A distributed-compute architecture across device and cloud can enable rich extended reality (XR) user experiences. Virtual reality (VR) and mixed reality (MR) are ready for deployment in private networks, while augmented reality (AR) for wide area networks can be enabled in the near term with Wi-Fi-powered AR glasses paired with a 5G-enabled phone. Device APIs that enable application adaptation are critical for a good user experience. 5G standards are evolving to support the deployment of AR glasses at large scale, setting the stage for the 6G era and the merging of the physical, digital, and virtual worlds. Techniques like perception-enhanced wireless offer significant potential to improve user experience. Qualcomm Technologies is enabling the XR industry with platforms, developer SDKs, and reference designs.
Check out this webinar to learn:
• How 5G and distributed-compute architectures enable the metaverse
• The latest results from our boundless XR 5G/6G testbed, including device APIs and perception-enhanced wireless
• 5G standards evolution for enhancing XR applications and the road to 6G
• How Qualcomm Technologies is enabling the industry with platforms, SDKs, and reference designs
Transforming enterprise and industry with 5G private networks (Qualcomm Research)
The 3GPP put the spotlight on industry expansion in July with 5G NR Release 16 and set the stage for enterprise and industry verticals to look at how to provide high-performance wireless connectivity with 5G private networks. With a variety of options for spectrum, different network architectures, a rich feature set to meet the demanding needs of the industrial Internet of Things (IIoT), and the privacy and security required for business assurance, 5G private networks are poised to transform enterprise and industry.
Watch the webinar at: https://pages.questexnetwork.com/Webinar-Qualcomm-Registration-101520.html?source=Qualcomm
AI firsts: Leading from research to proof-of-concept (Qualcomm Research)
AI has made tremendous progress over the past decade, with many advancements building on fundamental research done decades earlier. Accelerating the pipeline from research to commercialization has been daunting, since scaling technologies in the real world faces many challenges beyond the theoretical work done in the lab. Qualcomm AI Research has taken on the task of not only generating novel AI research but also being first to demonstrate proofs of concept on commercial devices, enabling technology to scale in the real world. This presentation covers:
- The challenges of deploying cutting-edge research on real-world mobile devices
- How Qualcomm AI Research is solving system and feasibility challenges with full-stack optimizations to quickly move from research to commercialization
- Examples where Qualcomm AI Research has had industrial or academic firsts
Mavenir: Why and How Private LTE & 5G Networks Are Rapidly Evolving for Enter... (Mavenir)
Dean Bubley, Founder of Disruptive Analysis and a well-known industry analyst, and Aniruddho Basu, Mavenir SVP/GM of Global Emerging Business, showcase the future of private LTE and 5G networks. Presentation from the "Why and How Private LTE & 5G Networks Are Rapidly Evolving for Enterprises" webinar.
5G will transform the IoT, expanding the reach of 5G and mobile technologies beyond smartphones. This presentation talks about how 5G is opening doors to new use cases, what is in the 5G evolution that will address the expanding IoT needs, and what Qualcomm is doing to deliver end-to-end technologies and solutions.
Propelling 5G forward: a closer look at 3GPP Release 16 (Qualcomm Research)
This presentation summarizes the 3GPP 5G NR Release 16 projects, including eMBB enhancements, unlicensed, sidelink, IAB, TSN, eURLLC, private networks, C-V2X, and more...
Enabling the rise of the smartphone: Chronicling the developmental history at... (Qualcomm Research)
Today’s smartphones are a marvel of modern technology — handheld devices with vast computing power, incredible multimedia and AI capabilities, and blazing fast data rates that support mobile browsing, social media interaction, and more. From humble beginnings as a cellphone focused purely on voice communication, the capability and functionality of modern smartphones have advanced tremendously. This presentation chronicles Qualcomm’s role in the rise of the smartphone from its initial beginnings to becoming the largest computing platform in the world. It includes:
- Key technology developments that led to today’s smartphones
- The role of Moore’s Law in driving new innovations and additional integration into mobile processors
- Qualcomm’s critical role in advancing the smartphone’s capabilities through groundbreaking innovations and key technology developments
- There is a rich roadmap of 5G technologies coming in the second half of the 5G decade with the 5G Advanced evolution
- 6G will be the future innovation platform for 2030 and beyond, building on the 5G Advanced foundation
- 6G will be more than just a new radio design, expanding the role of AI, sensing, and other capabilities in the connected intelligent edge
- Qualcomm is leading cutting-edge wireless research across six key technology vectors on the path to 6G
6G Training Course Part 9: Course Summary & Conclusion (3G4G)
After the successful launch of our '5G for Absolute Beginners' course (http://bit.ly/5Gbegins) in 2020, we decided to create an introductory training course on 6G mobile wireless communications technology. The course is ready, and the best way to navigate it is via the Free 6G Training page at https://bit.ly/6Gintro; this ensures that you have the latest version of each video, as well as the most recent 6G technology videos as they are added.
In this part we provide a summary of the course and conclude with next steps. We hope you found this course informative and useful. Do let us know what you thought and how you think the 6G journey will proceed.
This course is part of #Free6Gtraining initiative (https://www.free6gtraining.com/)
All our #3G4G5G slides and videos are available at:
Videos: https://www.youtube.com/3G4G5G
Slides: https://www.slideshare.net/3G4GLtd
6G and Beyond-5G Page: https://www.3g4g.co.uk/6G/
Free Training Videos: https://www.3g4g.co.uk/Training/
Free 6G Training Blog: https://www.free6gtraining.com/
With the components already introduced to the market, we are making the platform truly end-to-end by launching:
- The market’s first complete 5G radio system
- The first version of an E2E Core network capable of 5G use cases based on network slices
- A 5G core network which can now be connected to 5G NR radio
This already enables some 5G use cases today, allowing telecom operators to capture growth opportunities in 5G and Internet of Things services for consumers and enterprises.
This is a presentation I gave at the NVIDIA AI Conference in Korea. It's about building the largest GPU, the DGX-2: the most powerful supercomputer in one node.
Enel, AWS, and Athonet: Connecting Millions of IoT Devices on Private LTE (TL...) (Amazon Web Services)
The upcoming launch of unlicensed spectrum globally, including CBRS in the USA (and later MulteFire globally), sXGP in Japan, and LAA in France, opens up new opportunities for deploying 5G-ready, industrial-grade private LTE networks for industrial IoT applications. Deploying a private LTE network requires key considerations for both localized and widely distributed networks delivering highly resilient, low-latency broadband and narrowband (LTE-M, NB-IoT) LTE communications. In this workshop, we dive deep into how Enel plans to integrate millions of devices across power plants and field devices with AWS IoT, using a private LTE network from Athonet, to realize a smart electricity enterprise. Enel is one of the world's largest electricity utilities, with a 30-million smart meter program in Italy that has underlying LTE connectivity through a private MVNO. Athonet is a global market leader in private LTE, with over 100 dedicated LTE networks worldwide for industry, public safety, and digital use cases, and provides the mobile core network for Enel's private MVNO.
The Internet of Things is bringing a massive surge of smart, connected devices that will enable new services and efficiencies across industries. This requires wireless technologies to scale up or down depending on the application performance needs—to connect virtually anything. And now, LTE is evolving for low-throughput, delay-tolerant IoT use cases. The new narrowband LTE technologies (eMTC & NB-IoT) will deliver lower complexity, longer battery life, and deeper coverage for wide-area IoT applications.
5G + AI: The Ingredients For Next Generation Wireless Innovation (Qualcomm Research)
5G and AI are two of the most disruptive technologies the world has seen in decades. While each is individually revolutionizing industries and enabling new experiences, the combination of the two is going to be truly transformative. Applying AI not only to the 5G network but also to the device will lead to more efficient wireless communications, longer battery life, and enhanced user experiences. The low latency and high capacity of 5G will also allow AI processing to be distributed among the device, edge cloud, and central cloud, enabling flexible system solutions for a variety of use cases. At Qualcomm Technologies, we are not only working on cutting-edge research for 5G and AI, but we are also exploring their synergies to realize our vision of the future. View this presentation to learn how AI is making 5G better -- in the network and on the device, why on-device AI processing is essential, and how 5G is empowering distributed learning over wireless.
In this part we will look at how and why the industry and research community believes that things will be very different in 2030 and to get ready for that era, we need to start looking at and defining 6G today. While some believe that there will be an intermediate 5.5G or Beyond 5G step before jumping directly on to 6G, others believe that 6G will require step change that 5G evolution may not achieve satisfactorily.
Get a better understanding of 5G in this "Introduction to 5G" presentation by Doug Hohulin, Nokia 4G/5G Mobile Technology, whose focus is the strategy and business development of AV, UAS, smart city, IoT, and 5G technologies. This was part of Doug's presentation at the 2017 Gigabit City Summit (GCS17).
3GPP Release 17: Completing the first phase of 5G evolution (Qualcomm Research)
This presentation summarizes the 5G NR Release 17 projects, which were completed in March 2022. Release 17 further enhances the 5G foundation and expands it into new devices, use cases, and verticals.
This presentation and video look at the concepts of Open RAN, White Box RAN, and Virtualized RAN (vRAN). They examine the motivation to move away from traditional architectures, where the vendor supplies its own proprietary hardware and software, toward the new Open RAN architecture. The business case from an MNO/SP point of view is discussed, along with the results of the joint Open RAN RFI by Telefonica and Vodafone.
A detailed look at what is meant by private networks, why we need them, and why there is sudden interest in them. Also discussed are the 3GPP-defined 5G Non-Public Networks (NPN): their architecture, implementation, pros, and cons. In addition, RAN sharing and campus networks are discussed with regard to where they fit among private networks.
All our #3G4G5G slides and videos are available at:
Videos: https://www.youtube.com/3G4G5G
Slides: https://www.slideshare.net/3G4GLtd
5G Page: https://www.3g4g.co.uk/5G/
Free Training Videos: https://www.3g4g.co.uk/Training/
This presentation covers an industry perspective and a roadmap towards 5G with open and democratized interfaces. It covers examples of open reference platforms and how open-source communities can complement standards bodies such as 3GPP and IEEE. It characterizes RAN and user- and control-plane core microservices and discusses opportunities for embedded network telemetry for emerging machine learning applications.
Speaker: Tom Tofigh, Principal Member of Technical Staff (Architect) at AT&T
This presentation outlines the synergistic nature of 5G and AI -- two disruptive areas of innovations that can change the world. It illustrates the benefits of adopting AI for the advancements of 5G, as well as showcases the latest progress made by Qualcomm Technologies, Inc.
The need for intelligent, personalized experiences powered by AI is ever-growing. Our devices are producing more and more data that could help improve our AI experiences. How do we learn and efficiently process all this data from edge devices while maintaining privacy? On-device learning rather than cloud training can address these challenges. In this presentation, we’ll discuss:
- Why on-device learning is crucial for providing intelligent, personalized experiences without sacrificing privacy
- Our latest research in on-device learning, including few-shot learning, continuous learning, and federated learning
- How we are solving system and feasibility challenges to move from research to commercialization
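The federated learning mentioned above can be illustrated with a toy federated averaging (FedAvg) loop: each device trains on its own private data and only model weights, never raw data, are sent for aggregation. This is a minimal NumPy sketch, not Qualcomm's implementation; all function and variable names are invented for the example.

```python
import numpy as np

def local_step(weights, X, y, lr=0.1):
    """One round of on-device training: a single gradient step
    for linear regression on the device's private data."""
    preds = X @ weights
    grad = X.T @ (preds - y) / len(y)
    return weights - lr * grad

def fed_avg(global_weights, device_data, rounds=200):
    """Server loop: broadcast weights, collect locally trained
    copies, and average them. Raw data never leaves a device."""
    w = global_weights
    for _ in range(rounds):
        local_models = [local_step(w.copy(), X, y) for X, y in device_data]
        w = np.mean(local_models, axis=0)  # aggregate weight updates only
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
# Three "devices", each holding its own private dataset from the same model
devices = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ true_w
    devices.append((X, y))

w = fed_avg(np.zeros(2), devices)
print(np.round(w, 2))  # recovers weights close to [2.0, -1.0]
```

In a real deployment the averaging step would run on a server while each `local_step` runs on a handset, so privacy comes from the fact that only the weight vectors cross the network.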
Setting off the 5G Advanced evolution with 3GPP Release 18 (Qualcomm Research)
In December 2021, 3GPP reached consensus on the scope of 5G NR Release 18. This is a significant milestone marking the beginning of 5G Advanced — the second wave of wireless innovations that will fulfill the 5G vision. Release 18 will build on the solid foundation set by Releases 15, 16, and 17, and it sets the longer-term evolution direction of 5G and beyond. The release will encompass a wide range of new and enhancement projects, ranging from improved MIMO and an AI/ML-enabled air interface to extended reality optimizations and broader IoT support.
5G is designed to serve an unprecedented range of capabilities with a single global standard. With enhanced mobile broadband (eMBB), massive IoT (mIoT), and mission-critical IoT, the three pillars of 5G represent extremes in performance and associated complexity. For IoT services, NB-IoT and eMTC devices prioritize low power consumption and the lowest complexity for wide-area deployments (LPWA), while enhanced ultra-reliable, low-latency communication (eURLLC), along with time-sensitive networking (TSN), delivers the most stringent use case requirements. But there exists an opportunity to more efficiently address a broad range of mid-tier applications with capabilities ranging between these extremes.
In 5G NR Release 17, 3GPP introduced a new tier of reduced capability (RedCap) devices, also known as NR-Light. It is a new device platform that bridges the capability and complexity gap between the extremes in 5G today with an optimized design for mid-tier use cases. With the recent standards completion, NR-Light is set to efficiently expand the 5G universe to connect new frontiers.
Download this presentation to learn:
• What NR-Light is and why it can herald the next wave of 5G expansion
• How NR-Light is accelerating the growth of the connected intelligent edge
• Why NR-Light is a suitable 5G migration path for mid-tier LTE devices
Understand the fundamentals of the technology and learn how its deployment will impact us. We explored the different applications that 5G will have in our day-to-day lives and the value it will bring.
Services and applications’ infrastructure for agile optical networks (Tal Lavian, Ph.D.)
Huge advancements in optical devices, components, and networking.
The underlying layer of the Internet is optical. How can we take advantage of this?
How can applications take advantage of this?
Agile optical networks are starting to appear. What services and interfaces will we need between the optical control plane and the applications?
What are the applications?
The Internet architecture was built on assumptions that are now some 15-20 years old. Are some modifications needed?
Is packet switching good for everything? In some cases, is circuit switching better (e.g., moving terabytes of SAN data, P2P, streaming)?
The end-to-end argument: is it valid for all cases?
If not, in which cases, and what instead?
The current Internet architecture is based on L3. What is needed in order to offer services at L1-L2?
Computation vs. bandwidth: 10X in 5 years.
5G will connect virtually everything around us to transform a wide range of industries — manufacturing, automotive, logistics, and many more, and we are on track to make 5G NR — the global 5G standard — a commercial reality by 2019. However, this first phase of 5G mainly focuses on enhanced mobile broadband services, which will contribute to part of the total projected $12T 5G economy. 5G NR will continue to evolve in Release 16 and beyond to further expand 5G’s reach to new devices, services, and ecosystem players.
InfiniBand In-Network Computing Technology and Roadmap - inside-BigData.com
In this deck from the MVAPICH User Group, Gilad Shainer from Mellanox presents: InfiniBand In-Network Computing Technology and Roadmap.
"In-Network Computing transforms the data center interconnect into a 'distributed CPU' and 'distributed memory', making it possible to overcome performance barriers and to enable faster and more scalable data analysis. HDR 200G InfiniBand In-Network Computing technology includes several elements - Scalable Hierarchical Aggregation and Reduction Protocol (SHARP), smart Tag Matching and rendezvous protocol, and more. These technologies are in use at some of the recent large-scale supercomputers around the world, including the top TOP500 platforms. The session will discuss the InfiniBand In-Network Computing technology and performance results, as well as a view to the future roadmap."
Watch the video: https://wp.me/p3RLHQ-kIC
Learn more: http://mellanox.com and http://mug.mvapich.cse.ohio-state.edu/program/
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
Presentation that explores the specific requirements 5G has to deliver in order to support the next generation of VR and AR experiences.
https://www.qualcomm.com/invention/extended-reality
5G promises to change the way we live our lives, with unprecedented services and unparalleled user experience. Operators need to build an underlying connectivity infrastructure that is capable of delivering on demands like ultra-low latency and hyper-flexible bandwidth. This webinar will focus on the most critical aspects of the 5G transport network and discuss what is required in terms of slicing, edge computing and the need for openness and interworking. Addressing each of these aspects properly will enable operators to offer state of the art 5G services that will be the foundation of what some people believe will become the 4th industrial revolution.
Topics of discussion:
• What factors and demands will influence the infrastructure design?
• The impact of 5G on connectivity infrastructure and network requirements
• The available technology options and the preferred solutions
Generative AI models, such as ChatGPT and Stable Diffusion, can create new and original content like text, images, video, audio, or other data from simple prompts, as well as handle complex dialogs and reason about problems with or without images. These models are disrupting traditional technologies, from search and content creation to automation and problem solving, and are fundamentally shaping the future user interface to computing devices. Generative AI can apply broadly across industries, providing significant enhancements for utility, productivity, and entertainment. As generative AI adoption grows at record-setting speeds and computing demands increase, on-device and hybrid processing are more important than ever. Just like traditional computing evolved from mainframes to today’s mix of cloud and edge devices, AI processing will be distributed between them for AI to scale and reach its full potential.
In this presentation you’ll learn about:
- Why on-device AI is key
- Full-stack AI optimizations to make on-device AI possible and efficient
- Advanced techniques like quantization, distillation, and speculative decoding
- How generative AI models can be run on device and examples of some running now
- Qualcomm Technologies’ role in scaling on-device generative AI
As generative AI adoption grows at record-setting speeds and computing demands increase, hybrid processing is more important than ever. But just like traditional computing evolved from mainframes and thin clients to today’s mix of cloud and edge devices, AI processing must be distributed between the cloud and devices for AI to scale and reach its full potential. In this talk you’ll learn:
• Why on-device AI is key
• Which generative AI models can run on device
• Why the future of AI is hybrid
• Qualcomm Technologies’ role in making hybrid AI a reality
Qualcomm Webinar: Solving Unsolvable Combinatorial Problems with AI - Qualcomm Research
How do you find the best solution when faced with many choices? Combinatorial optimization is a field of mathematics that seeks to find the most optimal solutions for complex problems involving multiple variables. There are numerous business verticals that can benefit from combinatorial optimization, whether transport, supply chain, or the mobile industry.
More recently, we’ve seen gains from AI for combinatorial optimization, leading to scalability of the method, as well as significant reductions in cost. This method replaces the manual tuning of traditional heuristic approaches with an AI agent that provides a fast metric estimation.
In this presentation you will find out:
• Why AI is crucial in combinatorial optimization
• How it can be applied to two use cases: improving chip design and hardware-specific compilers
• The state-of-the-art results achieved by Qualcomm AI Research
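As a generic illustration of what combinatorial optimization means (not Qualcomm's AI-based method), the sketch below uses plain simulated annealing to search the exponentially large space of tours in a toy traveling-salesman instance; all function names and parameters are illustrative.

```python
import math
import random

def tour_length(tour, pts):
    """Total length of the closed tour visiting pts in the given order."""
    return sum(
        math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
        for i in range(len(tour))
    )

def anneal(pts, steps=20000, t0=1.0, seed=0):
    """Plain simulated annealing with segment-reversal (2-opt) moves."""
    rng = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    cur = best = tour_length(tour, pts)
    best_tour = tour
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        c = tour_length(cand, pts)
        # always accept improvements; accept worse moves with a
        # temperature-dependent probability (Metropolis criterion)
        if c < cur or rng.random() < math.exp((cur - c) / t):
            tour, cur = cand, c
            if cur < best:
                best, best_tour = cur, tour
    return best_tour, best

# Toy instance: 10 points on a unit circle, so the optimum is the
# circle order with length 10 * 2 * sin(pi / 10).
pts = [(math.cos(2 * math.pi * k / 10), math.sin(2 * math.pi * k / 10))
       for k in range(10)]
random.Random(1).shuffle(pts)
tour, length = anneal(pts)
```

An AI-assisted variant would replace the hand-tuned acceptance rule or cost evaluation with a learned fast metric estimator, which is the idea the abstract describes.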
3D perception is crucial for understanding the real world. It offers many benefits and new capabilities over 2D across diverse applications, from XR and autonomous driving to IoT, camera, and mobile. 3D perception with machine learning is creating the new state of the art (SOTA) in areas such as depth estimation, object detection, and neural scene representation. Making these SOTA neural networks feasible for real-world deployment on mobile devices constrained by power, thermal, and performance has been a challenge. Qualcomm AI Research has developed not only novel AI techniques for 3D perception but also full-stack AI optimizations to enable real-world deployments and energy-efficient solutions. This presentation explores the latest research that is enabling efficient 3D perception while maintaining neural network model accuracy. You’ll learn about:
- The advantages of 3D perception over 2D and the need for 3D perception across applications
- Advancements in 3D perception research by Qualcomm AI Research
- Our future 3D perception research directions
AI model efficiency is crucial for making AI ubiquitous, leading to smarter devices and enhanced lives. Besides the performance benefit, quantized neural networks also increase power efficiency for two reasons: reduced memory access costs and increased compute efficiency.
The quantization work done by the Qualcomm AI Research team is crucial in implementing machine learning algorithms on low-power edge devices. In network quantization, we focus both on pushing the state of the art (SOTA) in compression and on making quantized inference as easy to access as possible. For example, our SOTA work on overcoming oscillations in quantization-aware training pushes the boundaries of what is possible with INT4 quantization. Furthermore, for ease of deployment, integer formats such as INT16 and INT8 give accuracy comparable to floating-point formats, i.e., FP16 and FP8, but with significantly better performance per watt. Researchers and developers can make use of this quantization research to successfully optimize and deploy their models across devices with open-sourced tools like the AI Model Efficiency Toolkit (AIMET).
Presenters: Tijmen Blankevoort and Chirag Patel
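For readers unfamiliar with quantization, the following minimal sketch shows symmetric per-tensor INT8 quantization, the basic idea behind the integer formats discussed above; it is a toy illustration, not AIMET's actual API.

```python
def quantize_int8(xs):
    """Symmetric per-tensor quantization: map floats onto the
    integer grid [-127, 127] using a single scale factor."""
    scale = max(abs(x) for x in xs) / 127.0
    q = [max(-127, min(127, round(x / scale))) for x in xs]
    return q, scale

def dequantize(q, scale):
    """Map the integers back to (approximate) floats."""
    return [v * scale for v in q]

weights = [0.914, -1.27, 0.034, 0.5, -0.021, 1.103]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# the round-trip error is bounded by half a quantization step
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

Storing 8-bit integers instead of 32-bit floats cuts memory traffic 4x, which is where much of the performance-per-watt gain comes from.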
Bringing AI research to wireless communication and sensing - Qualcomm Research
AI for wireless is already here, with applications in areas such as mobility management, sensing and localization, smart signaling and interference management. Recently, Qualcomm Technologies has prototyped the AI-enabled air interface and launched the Qualcomm 5G AI Suite. These developments are possible thanks to expertise in both wireless and machine learning from over a decade of foundational research in these complementing fields.
Our approach brings together the modeling flexibility and computational efficiency of machine learning and the out-of-domain generalization and interpretability of wireless domain expertise.
In this webinar, Qualcomm AI Research presents an overview of state-of-the-art research at the intersection of the two fields and offers a glimpse into the future of the wireless industry.
Qualcomm AI Research is an initiative of Qualcomm Technologies, Inc.
Speakers:
Arash Behboodi, Machine Learning Research Scientist (Senior Staff Engineer/Manager), Qualcomm AI Research
Daniel Dijkman, Machine Learning Research Scientist (Principal Engineer), Qualcomm AI Research
How will sidelink bring a new level of 5G versatility? - Qualcomm Research
Today, the 5G system mainly operates on a network-to-device communication model, exemplified by enhanced mobile broadband use cases where all data transmissions are between the network (i.e., base station) and devices (e.g., smartphone). However, to fully deliver on the original 5G vision of supporting diverse devices, services, and deployment scenarios, we need to expand the 5G topology further to reach new levels of performance and efficiency.
That is why sidelink communication was introduced in 3GPP standards, designed to facilitate direct communication between devices, independent of connectivity via the cellular infrastructure. Beyond automotive communication, it also benefits many other 5G use cases such as IoT, mobile broadband, and public safety.
Realizing mission-critical industrial automation with 5G - Qualcomm Research
Manufacturers seeking better operational efficiencies, with reduced downtime and higher yield, are at the leading edge of the Industry 4.0 transformation. With mobile system components and reliable wireless connectivity between them, flexible manufacturing systems can be reconfigured quickly for new tasks, to troubleshoot issues, or in response to shifts in supply and demand.
There is a long history of R&D collaboration between Bosch Rexroth and Qualcomm Technologies for the effective application of these 5G capabilities to industrial automation use cases. At the Robert Bosch Elektronik GmbH factory in Salzgitter, Germany, this collaboration has reached new heights.
Download this deck to learn how:
• Qualcomm Technologies and Bosch Rexroth are collaborating to accelerate the Industry 4.0 transformation
• 5G technologies deliver key capabilities for mission-critical industrial automation
• Distributed control solutions can work effectively across 5G TSN networks
• A single 5G technology platform solves connectivity and positioning needs for flexible manufacturing
Cellular networks have facilitated positioning in addition to voice or data communications from the beginning, since 2G, and we’ve since grown to rely on positioning technology to make our lives safer, simpler, more productive, and even fun. Cellular positioning complements other technologies to operate indoors and outdoors, including dense urban environments where tall buildings interfere with satellite positioning. It works whether we’re standing still, walking, or in a moving vehicle. With 5G, cellular positioning breaks new ground to bring robust precise positioning indoors and outdoors, to meet even the most demanding Industry 4.0 needs.
As we look to the future, the Connected Intelligent Edge will bring a new dimension of positional insight to a broad range of devices, improving wireless use cases still under development. We’re already charting the course to 5G Advanced and beyond by working on the evolution of cellular positioning technology to include RF sensing for situational awareness.
Download the deck to learn more.
Data compression has improved by leaps and bounds over the years thanks to technical innovation, enabling the proliferation of streamed digital multimedia and voice over IP. For example, a regular cadence of technical advancement in video codecs has led to massive reductions in file size – up to 1000x when comparing a raw video file to a VVC-encoded file. However, with the rise of machine learning techniques and diverse data types to compress, AI may be a compelling tool for next-generation compression, offering a variety of benefits over traditional techniques. In this presentation we discuss:
- Why the demand for improved data compression is growing
- Why AI is a compelling tool for compression in general
- Qualcomm AI Research’s latest AI voice and video codec research
- Our future AI codec research work and challenges
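The "up to 1000x" figure can be sanity-checked with back-of-envelope arithmetic; the assumed parameters below (1080p, 60 fps, 8-bit 4:2:0) are illustrative choices, not taken from the presentation.

```python
# Raw video bitrate for 1080p60, 8-bit, 4:2:0 chroma subsampling
# (1.5 bytes per pixel on average: Y at full resolution, Cb and Cr
# each at quarter resolution).
width, height, fps = 1920, 1080, 60
bytes_per_pixel = 1.5
raw_bps = width * height * bytes_per_pixel * 8 * fps  # bits per second

vvc_bps = raw_bps / 1000  # applying the deck's "up to 1000x" reduction
print(f"raw: {raw_bps / 1e6:.0f} Mbps")
print(f"VVC: {vvc_bps / 1e6:.2f} Mbps")
```

Raw 1080p60 works out to roughly 1.5 Gbps, and dividing by 1000 lands at about 1.5 Mbps, which is indeed a realistic bitrate for compressed 1080p streaming, so the claimed ratio is plausible.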
Artificial Intelligence (AI), specifically deep learning, is revolutionizing industries, products, and core capabilities by delivering dramatically enhanced experiences. However, the deep neural networks of today use too much memory, compute, and energy. To make AI truly ubiquitous, it needs to run on the end device within tight power and thermal budgets. Advancements in multiple areas are necessary to improve AI model efficiency, including quantization, compression, compilation, and neural architecture search (NAS). In this presentation, we’ll discuss:
- Qualcomm AI Research’s latest model efficiency research
- Our new NAS research to optimize neural networks more easily for on-device efficiency
- How the AI community can take advantage of this research through our open-source projects, such as the AI Model Efficiency Toolkit (AIMET) and AIMET Model Zoo
How to build high performance 5G networks with vRAN and O-RAN - Qualcomm Research
5G networks are poised to deliver an unprecedented amount of data from a richer set of use cases than we have ever seen. This makes efficient networking in terms of scalability, cost, and power critical for the sustainable growth of 5G. Cloud technologies such as virtualization, containerization and orchestration are now powering a surge of innovation in virtualized radio access network (vRAN) infrastructure with modular hardware and software components, and standardized interfaces. While commercial off-the-shelf (COTS) hardware platforms provide the compute capacity for running vRAN software, hardware accelerators will also play a major role in offloading real-time and complex signal processing functions. Together, COTS platforms and hardware accelerators provide the foundation for building the intelligent 5G network and facilitate innovative new use cases with the intelligent wireless edge.
This presentation takes a look at the technology roadmap for 5G NR millimeter wave (mmWave), including features such as integrated access and backhaul (IAB), enhancements in beam management, mobility, coverage, and more. For more information, please visit www.qualcomm.com/mmwave
Video data is abundant and being generated at ever increasing rates. Analyzing video with AI can provide valuable insights and capabilities for many applications ranging from autonomous driving and smart cameras to smartphones and extended reality. However, as video resolution and frame rates increase while AI video perception models become more complex, running these workloads in real time is becoming more challenging. This presentation explores the latest research that is enabling efficient video perception while maintaining neural network model accuracy. You’ll learn about:
- How video perception is crucial for understanding the world and making devices smarter
- The challenges of on-device real-time video perception at high resolution through AI
- Qualcomm AI Research’s latest research and techniques for efficient video perception
Check out: https://www.qualcomm.com/AI
This presentation provides an overview of important 5G innovations around new and enhanced use of spectrum. It also captures the current 5G spectrum status across the globe.
Today, we take it for granted that our mobile devices and applications just work out of the box — smartphones can roam virtually anywhere in the world, laptops can seamlessly connect to any Wi-Fi access point & Bluetooth peripheral, and the videos recorded on one device can be played back perfectly on any other device.
The magic behind all this? Technology standards. Not only do they power a wide range of systems and devices but also bring many benefits to the broader technology ecosystem. At Qualcomm Technologies, we are leading the standardization of many key technologies that will move the world forward.
Download this presentation to learn:
- The value of technology standards, specifically in the areas of cellular, Wi-Fi, Bluetooth, and video codecs
- Why standardized technologies are essential for industry growth and ecosystem development
- How standards bodies operate in a complex, challenging, and ever-changing environment
- How Qualcomm is driving innovation in different technology standards
Artificial Intelligence (AI) is revolutionizing industries, products, and core capabilities by delivering dramatically enhanced experiences. However, this is just the start of the AI revolution. The field of AI, especially deep learning, is still in its infancy with tremendous opportunity for exploration and improvement. For instance, deep neural networks of today are rapidly growing in size and use too much memory, compute, and energy. To make AI truly ubiquitous, it needs to run on the end device within a tight power and thermal budget. New approaches and fundamental research in AI, as well as applying that research, is required to advance machine learning further and speed up adoption. View this presentation to learn about select research topics that Qualcomm AI Research is investigating, including:
o AI model optimization research for power efficiency, including our latest quantization research
o Applied AI research, such as using deep learning for improved radar functionality
o Fundamental AI research, such as source compression and quantum AI
Neuro-symbolic is not enough, we need neuro-*semantic* - Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Smart TV Buyer Insights Survey 2024 - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Transcript: Selling digital books in 2024: Insights from industry leaders - T... - BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Search and Society: Reimagining Information Access for Radical Futures - Bhaskar Mitra
The field of Information Retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build, inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation, however, takes real work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
UiPath Test Automation using UiPath Test Suite series, part 4 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
2. Delivering on the 5G vision: where virtually everyone and everything is intelligently connected.
3. 5G is built for low-latency and high-reliability applications like XR; our fundamental research has been contributed to 3GPP. Key enablers include shorter time slots (1 ms in low-band, 0.5 ms in mid-band, 0.125 ms in mmWave), fast processing times, and reliable beamforming for the data and control exchanges between the device and the TRP.
DL: downlink; UL: uplink; TRP: Transmission and Reception Point
5. The metaverse: a persistent spatial internet with personalized digital experiences, spanning both physical and virtual worlds. It is a shared virtual space in VR today, evolving to digitally enhanced physical space with AR & MR. Your ticket to the metaverse.
Snapdragon is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.
6. XR is evolving to be the next computing platform, the "Next Platform". Today: standalone VR and AR, and AR viewers cabled to a phone. In 1–4 years: standalone VR and AR, and wireless AR viewers tethered over Wi-Fi, on the road to the 6G metaverse.
7. 7
ATW: Asynchronous Time Warp
A distributed
compute
architecture
enables rich XR
user experience
M2R2P = Edge processing + 5G round-trip time + Device processing
Game engine
Encoder
Application
Post processing
(ATW, error
concealment, …)
Decoder
Perception
algorithms
Compressed rendered frames
6DoF HMD pose, controller, eye/hand tracking
Motion-to-render-to-photon latency
M2R2P
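Because the M2R2P breakdown is a simple sum, the latency budget is easy to account for; the component values below are illustrative, chosen only to be consistent with the sub-70 ms budget quoted for the VR trials.

```python
def m2r2p_ms(edge_ms, rtt_5g_ms, device_ms):
    """Motion-to-render-to-photon latency per the deck's breakdown:
    edge processing + 5G round-trip time + device processing."""
    return edge_ms + rtt_5g_ms + device_ms

# Illustrative split: a sub-20 ms 5G round trip leaves about 50 ms
# for edge rendering/encode plus on-device decode and post processing.
total = m2r2p_ms(edge_ms=30, rtt_5g_ms=18, device_ms=20)
print(total)  # 68 ms, inside the 70 ms budget
```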
8. Boundless VR 5G operator trials, ready for deployments in private networks: device optimizations with 25% tail latency improvements, less than 20 ms 5G round-trip time, and less than 70 ms M2R2P latency.
10. Cloud-to-phone-to-AR glasses, targeting wide area deployments (demo video): device optimizations with 38% tail latency improvements, less than 30 ms combined 5G + Wi-Fi round-trip times1, and a target of 50–100 ms M2R2P latency.
1: More immersive AR experiences for indoor applications require less than 20 ms round-trip time.
11. Enabling 5G-powered AR glasses over 5G NR and the edge cloud:
• Optimized edge processing: migration from central cloud to local edge
• Low-power, low-latency 5G: 3GPP-based features
• 5G modem APIs: enabling low-latency on-device optimizations and enabling applications to adapt to RF/network conditions
• Improved infra schedulers1: delay-aware schedulers to meet latency QoS
1: For example, using Application Data Unit (ADU) Error Rate and Delay Budget vs. Packet Error Rate (PER) and Packet Delay Budget (PDB)
12. 5G modem API to support low-latency application requirements:
• Packet prioritization: higher- and lower-priority buffers so that relevant packets are decoded within N ms
• Dynamic packet processing on the 5G protocol stack (SDAP, PDCP, RLC, MAC, PHY)
• Better beam management for improved reliability
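The packet-prioritization idea can be sketched with a priority queue: higher-priority, deadline-critical packets (such as pose updates) drain before bulk video, a toy model of the two-buffer picture on the slide. All packet names are hypothetical.

```python
import heapq

def drain_in_priority_order(packets):
    """Pop packets highest-priority first and, within a priority level,
    earliest-deadline first (a toy model of the two-buffer picture)."""
    heap = list(packets)  # (priority, deadline_ms, payload) tuples
    heapq.heapify(heap)   # tuples compare element-wise, so the heap
                          # orders by priority, then deadline
    order = []
    while heap:
        order.append(heapq.heappop(heap)[2])
    return order

# priority 0 = higher (e.g., pose updates), 1 = lower (bulk video)
packets = [
    (1, 50, "video-slice-a"),
    (0, 10, "pose"),
    (1, 40, "video-slice-b"),
    (0, 5, "hand-tracking"),
]
print(drain_in_priority_order(packets))
# hand-tracking and pose drain before either video slice
```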
13. 5G modem API for improved application adaptation and UX (demo video): improved video quality, less stuttering, and faster rate adaptation.
14. 3GPP is developing 5G Advanced on the path to 6G, with continued 5G evolution in the 6G era. 5G (Rel-15 through Rel-17, roughly 2018-2022) is a unified platform for innovations; 5G Advanced (Rel-18 onward, from 2023) is the second wave of 5G innovations; 6G (around Rel-20/Rel-21, toward 2030+) is the next technology leap for new capabilities and efficiencies. In parallel run foundational research, vision forming, service requirements, the 3GPP 6G workshop, study items and work items, and trials and IoDTs, with spectrum milestones at WRC-19, WRC-23, WRC-27, and WRC-31.
15. 3GPP enhancements for XR device latency and power efficiency span LTE Advanced Pro and the Release 15/16/17 XR techniques: Rel-15/16 low-power modes, Rel-16 uplink enhancements, and Rel-17 XR burst handling.
• UL configured grant: handle periodic traffic
• Slot aggregation: reduce HARQ latencies
• UL skipping: save power if no data
• PDCCH skipping: faster transition to sleep after an XR burst
• Bandwidth part: reduce bandwidth if no/low data
• SCell dormancy: switch to low power if no data
• Cross-slot scheduling: gap between control and data for sleep
PDCCH: Physical Downlink Control Channel; UL: Uplink; SCell: Secondary cell; HARQ: Hybrid Automatic Repeat Request
16. Rel-16 provides better handling of uplink XR traffic (for example, AR uplink traffic carrying pose, hand tracking, and video), reducing latency and device power consumption:
• UL Configured Grant (UL-CG): periodic resources for low-latency traffic
• Slot aggregation: repetition in the next 'U' slot improves reliability and reduces HARQ latencies
• Uplink skipping: skip UL grants when there is no data, saving power
MCS: Modulation and Coding Scheme; HARQ: Hybrid Automatic Repeat reQuest; BSR: Buffer Status Reporting
17. Rel-17 Reduced Capability (NR-Light) can enable a category of low-power AR glasses, trading lower data rates for lower device complexity and reduced power consumption:
• Narrower bandwidths: 20 MHz in sub-7 GHz
• Lower-order modulation (256-QAM optional)
• Half-duplex operation
• No carrier aggregation or dual connectivity
• Fewer receive antennas: 1-2 Rx
Source: RP-211574 (Support of reduced capability NR devices)
18. Further improving the XR experience with 5G Advanced. Release-18 proposals target lower latency, lower power, and higher capacity:
• Low-latency mobility: L1/L2 signaling for handoffs
• QoS based on multimedia payload: define QoS based on PDU sets
• Improved scheduler: staggering UE traffic arrivals at the gNodeB
• Retransmission-less configured grant: sleep after low-latency uplink transmission
• Enhanced CDRX and configured grant: align transmission to the multimedia cadence
19. Rel-18 aligns transmission times to the multimedia cadence, reducing latencies and device power consumption. With XR burst arrivals at 120 Hz (every 8.33 ms), fixed CDRX cycles of 8 ms or 9 ms are not aligned to the bursts and drift relative to them, while an enhanced CDRX cycle of 8.33 ms stays aligned to every burst. Enhanced CDRX eliminates drift between DRX-ON and XR traffic; an enhanced CG likewise eliminates drift between CG occasions and XR traffic.
CG: Configured Grant; CDRX: Connected-mode Discontinuous Reception
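The drift can be quantified with exact rational arithmetic: against a 120 Hz burst cadence, a fixed 8 ms CDRX cycle leaves some bursts waiting almost a full cycle for the next DRX-ON occasion, while a cycle matched to the cadence never waits. This is an illustrative model of the drift, not the 3GPP timing specification.

```python
from fractions import Fraction

def max_wait(cdrx_cycle, burst_period, n_bursts=240):
    """Worst-case wait from an XR burst arrival to the next DRX-ON
    occasion, checked over n_bursts consecutive bursts (exact math)."""
    worst = Fraction(0)
    for k in range(n_bursts):
        arrival = k * burst_period
        on_occasion = -(-arrival // cdrx_cycle)  # ceiling division
        worst = max(worst, on_occasion * cdrx_cycle - arrival)
    return worst

burst = Fraction(1000, 120)           # 120 Hz bursts -> 8 1/3 ms apart
legacy = max_wait(Fraction(8), burst)  # fixed 8 ms CDRX cycle drifts
enhanced = max_wait(burst, burst)      # Rel-18: cycle matches the cadence
print(float(legacy), float(enhanced))  # ~7.67 ms worst case vs. 0 ms
```

Fractions are used instead of floats so the 8⅓ ms period is represented exactly and the matched case comes out to exactly zero wait.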
20.
Rel-18 enables devices to sleep after uplink transmission
Retransmission-less Configured Grant:
1. Uses a conservative MCS to improve the reliability of the first transmission
2. Avoids the need for the UE to monitor the control channel after the CG
3. Allows longer sleep cycles, reducing device power consumption
HARQ: Hybrid Automatic Repeat reQuest; MCS: Modulation and Coding Scheme; CG: Configured Grant
[Figure: for low-latency, low-bandwidth transmissions (e.g., a 100-byte pose), a legacy device stays ON during the HARQ retransmission timer after each CG, while a Rel-18 device can sleep]
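A rough back-of-the-envelope sketch shows why skipping retransmission monitoring matters for power. All the timing values below are hypothetical placeholders chosen for illustration, not numbers from the slide or the standard.

```python
# Hypothetical ON-time comparison for a retransmission-less configured grant:
# a legacy device monitors the control channel during the HARQ retransmission
# timer after each CG transmission; a Rel-18 device can sleep immediately.

CG_PERIOD_MS = 4.0   # pose sent every 4 ms (illustrative)
TX_MS = 0.5          # time to transmit the small pose payload (illustrative)
HARQ_TIMER_MS = 2.0  # time spent awake awaiting a possible retransmission

def on_time_fraction(monitor_after_tx):
    """Fraction of each CG period the device radio stays ON."""
    on = TX_MS + (HARQ_TIMER_MS if monitor_after_tx else 0.0)
    return on / CG_PERIOD_MS

print(on_time_fraction(True))   # 0.625 -> legacy: awake most of each period
print(on_time_fraction(False))  # 0.125 -> Rel-18: sleep right after transmitting
```

For frequent, tiny uplink payloads like pose, the post-transmission monitoring window can dominate ON-time, which is why item 3 above (longer sleep cycles) is the main power win.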
21.
Rel-18 enables QoS based on multimedia traffic payloads instead of IP packets
• QoS defined for PDU sets: a group of IP packets from the application, such as a slice of a video frame
• QoS parameters include error rate and delay
• Enables RAN schedulers to satisfy multimedia QoS requirements
[Figure: an XR traffic flow from the application at the server to the RAN scheduler, with the IP packets of each XR burst grouped into PDU sets]
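The PDU-set idea can be sketched in a few lines. The class and field names below are hypothetical illustrations of the concept (treating a group of packets as one schedulable unit with a shared delay budget), not 3GPP-defined structures.

```python
# Illustrative sketch: grouping the IP packets of one application unit
# (e.g., a video-frame slice) into a PDU set, so the scheduler can reason
# about the whole set instead of individual packets.
from dataclasses import dataclass

@dataclass
class PduSet:
    packets: list            # payload sizes in bytes (one application unit)
    delay_budget_ms: float   # delay budget for the whole set (illustrative)
    arrival_ms: float        # when the set arrived at the scheduler

    def useful(self, now_ms: float, lost: int) -> bool:
        # If the budget expired, or a packet of the set was lost so the slice
        # cannot be decoded, the remaining packets need not be scheduled.
        return (now_ms - self.arrival_ms) <= self.delay_budget_ms and lost == 0

s = PduSet(packets=[1200, 1200, 800], delay_budget_ms=10.0, arrival_ms=0.0)
print(s.useful(now_ms=5.0, lost=0))   # True: within budget, set intact
print(s.useful(now_ms=12.0, lost=0))  # False: budget expired, drop whole set
```

This is the capacity argument on the slide: per-packet QoS would keep transmitting packets of a slice that the decoder can no longer use, while set-level QoS lets the RAN scheduler discard them.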
22.
Rel-18 staggers UE application traffic arrivals from the server to the base station
• The base station signals the user's burst-arrival time offset to the server
• The server staggers user traffic to the base station
• Improves base station scheduling
[Figure: without staggering, users' bursts collide at the scheduler (higher latencies, higher power and device ON-time, lower capacity); with staggering, bursts are spread across time (lower latencies, lower power and device ON-time, higher capacity)]
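The staggering mechanism above is easy to illustrate: spread per-user offsets evenly across one burst period so bursts never pile up. The burst rate and user count below are hypothetical; the offset rule is one simple choice, not the signaling defined in the standard.

```python
# Sketch of traffic staggering: the base station assigns each user a time
# offset, and the server shifts that user's burst schedule accordingly.
from collections import Counter

def burst_times(offset_ms, period_ms, n):
    """Arrival times of n periodic bursts starting at the given offset."""
    return [round(offset_ms + k * period_ms, 3) for k in range(n)]

PERIOD = 10.0  # 100 Hz bursts -> 10 ms period (illustrative)
USERS = 4

# No staggering: every user's bursts land at the same instants.
unstaggered = [burst_times(0.0, PERIOD, 5) for _ in range(USERS)]
# Staggering: spread users evenly across one burst period.
staggered = [burst_times(u * PERIOD / USERS, PERIOD, 5) for u in range(USERS)]

def max_collisions(schedules):
    """Largest number of users whose bursts arrive at the same instant."""
    counts = Counter(t for sched in schedules for t in sched)
    return max(counts.values())

print(max_collisions(unstaggered))  # 4: all users collide on every burst
print(max_collisions(staggered))    # 1: at most one user's burst per instant
```

With collisions removed, each burst can be served as it arrives instead of queuing behind the other users' bursts, which is the latency, power, and capacity gain claimed in the figure.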
23.
Rel-18 improves mobility management with better cell handovers
• Rel-15 uses RRC (L3) signaling, leading to handoff delays
• Rel-18 uses L1/L2 signaling for fast PCell updates, reducing handoff delays and extending coverage
RRC: Radio Resource Control; PCell: Primary cell; PCI: Physical Cell Identity
[Figure: cells with different PCIs; Rel-15 places an L3 boundary at each cell edge, while Rel-18 lower-layer mobility moves the L3 boundary outward and switches PCells via L1/L2 signaling within it]
24.
6G will enable the merging of our worlds
Physical, digital, and virtual: immersive interactions taking human augmentation to the next level via ubiquitous, low-power joint communication and sensing
• AI-native E2E communications: data-driven communication and network design, with joint training, model sharing, and distributed inference across networks and devices
• Air interface innovations: waveform/coding from MHz to THz, advanced duplexing, Giga-MIMO, mmWave evolution, passive MIMO, satellite communications, system energy efficiency
• Communications resiliency: multifaceted trust and configurable security, post-quantum security, robust networks tolerant to failures and attacks
• Spectrum expansion and sharing: expanding to THz, wide-area expansion to higher bands, new spectrum-sharing paradigms, dynamic coordination with environmental awareness
• Scalable network architecture: disaggregation and virtualization at the Connected Intelligent Edge, use of advanced topologies to address growing demand
[Figure: digitization links the physical world to the digital world, and spatial computing links the digital world to the virtual world, converging in the metaverse]
25.
6G XR requirements fueled by digital twins and spatial compute
Digital twins digitize the complex physical world in the metaverse; spatial compute enables immersive interaction with 3D digital content
• 100x network capacity
• 0.1-10 Gbps per user
• Use of multiple frequency bands (sub-THz, mmWave, sub-7 GHz, 7-24 GHz, unlicensed, shared spectrum)
26.
6G protocols can natively support distributed compute
• Incorporating e2e QoS and session handling
• Delivering low power and low latencies
• Dynamically distributing workloads
[Figure: a <100 mW modem+RF device offloads workloads to a high-performance GPU/CPU; workloads include 6DoF/SLAM, hand tracking, facial expression, eye tracking, 3D graphics rendering, scene understanding, object recognition and tracking, and camera processing]
27.
6G can fuse perception and wireless to improve performance
• We demonstrated that perception inputs can improve mmWave beamforming accuracy
• 6G protocols can enable network and device perception services
• 6DoF pose and depth maps can be used to map reflectors and blockers, improving beamforming
SSB: Synchronization Signal Block
[Figure: a device maps a reflector and a blocker relative to an anchor, using the map to aid beam selection]
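The idea of perception-aided beamforming can be sketched with toy 2D geometry: if the device's pose and a mapped blocker position show the line-of-sight path is obstructed, steer toward a mapped reflector instead. This is a simplified illustration of the concept, not the demonstrated testbed algorithm; all coordinates and the blocker radius are made up.

```python
# Toy 2D sketch: use perception inputs (device pose, mapped blocker position)
# to predict whether the direct mmWave path is blocked before beam selection.
import math

def blocked(ue, gnb, blocker, radius=1.0):
    """True if the straight ue->gnb path passes within `radius` of the blocker."""
    (x1, y1), (x2, y2), (bx, by) = ue, gnb, blocker
    dx, dy = x2 - x1, y2 - y1
    # project the blocker onto the ue->gnb segment, clamped to its endpoints
    t = max(0.0, min(1.0, ((bx - x1) * dx + (by - y1) * dy) / (dx * dx + dy * dy)))
    px, py = x1 + t * dx, y1 + t * dy
    return math.hypot(bx - px, by - py) < radius

ue, gnb = (0.0, 0.0), (10.0, 0.0)
if blocked(ue, gnb, blocker=(5.0, 0.2)):
    print("LOS blocked: steer beam toward a mapped reflector")
else:
    print("LOS clear: use the direct beam")
```

The testbed point on this slide is that such geometric predictions from 6DoF pose and depth maps can narrow the beam search before any over-the-air measurement confirms it.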
28.
Qualcomm Technologies is enabling the XR industry
Platforms, software & algorithms, reference designs, and ecosystem
Software and algorithms include: hand tracking, 6DoF/SLAM, eye tracking, pass-through (video see-through), 3D reconstruction, and object recognition & tracking
Snapdragon and Snapdragon Spaces are products of Qualcomm Technologies, Inc. and/or its subsidiaries
29.
Snapdragon powers key XR platforms: 60+ devices launched
Xiaomi Mi VR, ThirdEye X2, Nreal Light, Microsoft HoloLens 2, HTC Vive Focus Plus, iQiyi QiYu 2S, Rokid Glass, HTC Vive Cosmos, DPVR P1 Pro, 3Glasses X1, Shadow Creator Jimo, RealWear HMT-1, Shadow Creator Shadow VR, Pico Neo, Oculus Quest 2, iQiyi 3, XRSpace Mova, Lenovo Mirage Solo, Pico G2 4K, Shadow Creator Action One, Realmax Qian, Lenovo ThinkReality A6, Google Glass Enterprise Edition 2, Oculus Quest, Ximmerse Rhino X, Oculus Go, Lenovo ThinkReality A3, Vuzix M400, HTC VIVE Focus 3, Meta Quest Pro, Shadow Creator HONGHU
30.
Snapdragon Spaces™ XR Developer Platform
Empowering developers to create immersive experiences for AR glasses
• Open platform and open ecosystem
• Paves the way to a new frontier of spatial computing
Qualcomm launches up to $100M Snapdragon™ Metaverse Fund
31.
5G enables boundless XR at scale and is ready for commercialization
Device APIs with 5G optimizations enable richer user experiences and more efficient deployments
5G standards have a strong evolution path for enhanced XR experiences, leading into 6G
We are making boundless XR possible by enabling the industry with 5G modem APIs, XR platforms, XR SDKs, and XR reference designs
32.
Connect with us
Questions? qualcomm.com/5g
@QCOMResearch
qualcomm.com/news/onq
slideshare.net/qualcommwirelessevolution
youtube.com/qualcomm