This presentation outlines the synergistic nature of 5G and AI -- two disruptive areas of innovation that can change the world. It illustrates the benefits of adopting AI for the advancement of 5G and showcases the latest progress made by Qualcomm Technologies, Inc.
5G + AI: The Ingredients For Next Generation Wireless Innovation (Qualcomm Research)
5G and AI are two of the most disruptive technologies the world has seen in decades. While each is individually revolutionizing industries and enabling new experiences, the combination of both 5G and AI is going to be truly transformative. Applying AI not only to the 5G network but also the device will lead to more efficient wireless communications, longer battery life and enhanced user experiences. The low latency and high capacity of 5G will also allow AI processing to be distributed amongst the device, edge cloud and central cloud, enabling flexible system solutions for a variety of use cases. At Qualcomm Technologies, we are not only working on cutting-edge research for 5G and AI, but we are also exploring their synergies to realize our vision of the future. View this presentation to learn how AI is making 5G better -- in the network and on the device, why on-device AI processing is essential, and how 5G is empowering distributed learning over wireless.
- There is a rich roadmap of 5G technologies coming in the second half of the 5G decade with the 5G Advanced evolution
- 6G will be the future innovation platform for 2030 and beyond building on the 5G Advanced foundation
- 6G will be more than just a new radio design, expanding the role of AI, sensing and others in the connected intelligent edge
- Qualcomm is leading cutting-edge wireless research across six key technology vectors on the path to 6G
- 5G Cellular Technology
- Internet of Things
- 5G and IoT
- The Evolution of 5G
- 5G: A Paradigm Shift and Rethinking of Mobile Business
- 5G Cellular Network Architecture
- 5G Working with 4G
- Technology behind 5G
- Millimeter Waves
- 5G Core Network Architecture
- Network Slice Definition
- 5G Service-Based Architecture (SBA)
- 5G Will Enrich the Telecommunication Ecosystem
- The Internet of Things
- Evolution of IoT
- Four-Layer Model for IoT
- Typical IoT Architecture
- 5G + IoT: Ushering in a New Era
- Impact of 5G on IoT
- Key Technologies Which Enable 5G-IoT
- Wireless Network Function Virtualization (WNFV)
- The Architecture of 5G-IoT
- Device-to-Device (D2D) Communication
- 5G and IoT Applications
- Research Challenges for 5G
- Future of IoT
Bringing AI research to wireless communication and sensing (Qualcomm Research)
AI for wireless is already here, with applications in areas such as mobility management, sensing and localization, smart signaling, and interference management. Recently, Qualcomm Technologies has prototyped the AI-enabled air interface and launched the Qualcomm 5G AI Suite. These developments are possible thanks to expertise in both wireless and machine learning from over a decade of foundational research in these complementary fields.
Our approach brings together the modeling flexibility and computational efficiency of machine learning and the out-of-domain generalization and interpretability of wireless domain expertise.
In this webinar, Qualcomm AI Research presents an overview of state-of-the-art research at the intersection of the two fields and offers a glimpse into the future of the wireless industry.
Qualcomm AI Research is an initiative of Qualcomm Technologies, Inc.
Speakers:
Arash Behboodi, Machine Learning Research Scientist (Senior Staff Engineer/Manager), Qualcomm AI Research
Daniel Dijkman, Machine Learning Research Scientist (Principal Engineer), Qualcomm AI Research
3GPP Release 17: Completing the first phase of 5G evolution (Qualcomm Research)
This presentation summarizes the 5G NR Release 17 projects that were completed in March 2022. Release 17 further enhances the 5G foundation and expands it to new devices, use cases, and verticals.
Get a better understanding of 5G in this "Introduction to 5G" presentation by Doug Hohulin of Nokia 4G/5G Mobile Technology, whose focus is the strategy and business development of AV, UAS, Smart City, IoT, and 5G technologies. This was part of Doug's presentation at the 2017 Gigabit City Summit (GCS17).
While 6G is still roughly a decade away, organizations that stay up to date with what this next networking architecture will bring to the table will have a decisive advantage over their competitors.
While 5G commercialization is still in its initial stage, it is never too soon to begin planning for 6G, because it typically takes around ten years from the start of research to the commercialization of a new generation of communications technology.
Most tech insiders believe 6G will need to hit several benchmarks first, beginning with a hyper-fast data rate that beats 5G: download speeds of at least 1,000 Gbps (many times the speed of 5G), record-low air latency of under 100 μs, end-to-end (E2E) latency under 1 ms, and extremely low delay jitter, on the order of microseconds.
Technology doesn't rest.
Although 5G technology is still in the early stages of deployment, the tech world is already debating ideas about what the next generation, 6G, might look like.
This isn't surprising, because the technology that goes into 5G's successor will take time to develop.
Tonex offers Introduction to 6G, IMT-2030, a one-day course covering the design motivation and basic technology of 6G architecture, as well as the new 6G terminology. Participants learn how 5G and 6G compare and understand how 6G achieves its goals by studying its functions.
Introduction to 6G Course by Tonex
Introduction to 6G, IMT-2030 is a one-day overview of 6G technology consistent with ITU-T IMT-2030. Learn about 6G wireless systems, use cases, applications, trends, technologies, and protocols.
6G, or IMT-2030, is the future of mobile networks envisioned by the ITU-T Network 2030 initiative. Tonex now offers training courses to help develop next-generation 6G skills.
Who Should Attend
This one-day training course covers the design motivation and basic technology of the 6G architecture, as well as new 6G vocabulary. You will also learn the differences between 5G and 6G and understand how 6G will achieve its goals by examining how 6G works.
An advanced 6G technical overview for anyone involved in 6G product development, product management, analysis, planning, design and engineering.
Learning Objectives
Describe the 6G vision and business case
Explain the key technologies and basic components of 6G networks
Draw end-to-end 6G network architecture, including new radio types, core networks and applications
Outline the step-by-step evolution from 5G to 6G
Course outline
Overview of 6G Wireless Networks
6G Vision, Architecture, and Key Technologies
Holographic-Type Communications
Learn More
Introduction to 6G, Prepare Now Training
https://www.tonex.com/introduction-to-6g-prepare-now-training/
Mavenir: Why and How Private LTE & 5G Networks Are Rapidly Evolving for Enterprises (Mavenir)
Dean Bubley, Founder of Disruptive Analysis and well-known industry analyst, and Aniruddho Basu, Mavenir SVP/GM of Global Emerging Business, showcase the future of Private LTE & 5G Networks. Presentation from the "Why and How Private LTE & 5G Networks Are Rapidly Evolving for Enterprises" webinar.
5G will transform the IoT, expanding the reach of 5G and mobile technologies beyond smartphones. This presentation talks about how 5G is opening doors to new use cases, what is in the 5G evolution that will address the expanding IoT needs, and what Qualcomm is doing to deliver end-to-end technologies and solutions.
Ericsson brings new updates to its 5G platform, introducing 5G network services to support operators from preparation to 5G launch. The Ericsson 5G services roadmap spans three distinct phases: Prepare, Mobilize, and Launch. Through our service offerings, operators can now evolve their 4G networks and smoothly start introducing 5G, reaching new heights on their journey to 5G.
5G is designed to serve an unprecedented range of capabilities with a single global standard. With enhanced mobile broadband (eMBB), massive IoT (mIoT), and mission-critical IoT, the three pillars of 5G represent extremes in performance and associated complexity. For IoT services, NB-IoT and eMTC devices prioritize low power consumption and the lowest complexity for wide-area deployments (LPWA), while enhanced ultra-reliable, low-latency communication (eURLLC), along with time-sensitive networking (TSN), delivers the most stringent use case requirements. But there exists an opportunity to more efficiently address a broad range of mid-tier applications with capabilities ranging between these extremes.
In 5G NR Release 17, 3GPP introduced a new tier of reduced capability (RedCap) devices, also known as NR-Light. It is a new device platform that bridges the capability and complexity gap between the extremes in 5G today with an optimized design for mid-tier use cases. With the recent standards completion, NR-Light is set to efficiently expand the 5G universe to connect new frontiers.
Download this presentation to learn:
• What NR-Light is and why it can herald the next wave of 5G expansion
• How NR-Light is accelerating the growth of the connected intelligent edge
• Why NR-Light is a suitable 5G migration path for mid-tier LTE devices
Cellular networks have facilitated positioning in addition to voice or data communications from the beginning, since 2G, and we’ve since grown to rely on positioning technology to make our lives safer, simpler, more productive, and even fun. Cellular positioning complements other technologies to operate indoors and outdoors, including dense urban environments where tall buildings interfere with satellite positioning. It works whether we’re standing still, walking, or in a moving vehicle. With 5G, cellular positioning breaks new ground to bring robust precise positioning indoors and outdoors, to meet even the most demanding Industry 4.0 needs.
As we look to the future, the Connected Intelligent Edge will bring a new dimension of positional insight to a broad range of devices, improving wireless use cases still under development. We’re already charting the course to 5G Advanced and beyond by working on the evolution of cellular positioning technology to include RF sensing for situational awareness.
Download the deck to learn more.
How will sidelink bring a new level of 5G versatility? (Qualcomm Research)
Today, the 5G system mainly operates on a network-to-device communication model, exemplified by enhanced mobile broadband use cases where all data transmissions are between the network (i.e., base station) and devices (e.g., smartphone). However, to fully deliver on the original 5G vision of supporting diverse devices, services, and deployment scenarios, we need to expand the 5G topology further to reach new levels of performance and efficiency.
That is why sidelink communication was introduced in 3GPP standards, designed to facilitate direct communication between devices, independent of connectivity via the cellular infrastructure. Beyond automotive communication, it also benefits many other 5G use cases such as IoT, mobile broadband, and public safety.
• Overview of Ericsson’s product or service and its functionality
• How it promotes connectivity
• Why this product or service will make the lives of millions easier
• Key features and benefits
https://www.ericsson.com/en/portfolio
** This portfolio was created by me as part of my application for a position at Ericsson. Unfortunately, due to personal circumstances, I was unable to proceed to the final viva event. However, I believe that this PPT can serve as an example for anyone seeking to create a portfolio for Ericsson. I hope it proves to be helpful and insightful for those who are interested.
Thank You
Keynote Talk on Recent Advances in Mobile Grid and Cloud Computing (Sayed Chhattan Shah)
Due to recent advances in mobile computing and networking technologies, it has become feasible to integrate various mobile devices, such as robots, aerial vehicles, sensors, and smartphones, with grid and cloud computing systems. This integration enables the design and development of the next generation of applications through the sharing of computing resources in mobile environments, but it also introduces several challenges due to dynamic and unpredictable networks.
In this talk, we will discuss applications and challenges involved in design and development of mobile grid and cloud computing systems, cloud robots, and innovative architectures for creating energy efficient and robust mobile cloud.
This is the second part of my 5G project, covering the 5G architecture: what it consists of, which technologies it uses, and which layers it contains.
It also looks ahead to a future Nanocore-based architecture that will benefit upcoming applications.
We envision a world where devices, machines, automobiles, and things are much more intelligent, simplifying and enriching our daily lives. They will be able to perceive, reason, and take intuitive actions based on awareness of the situation, improving just about any experience and solving problems that to this point we’ve either left to the user, or to more conventional algorithms.
Artificial intelligence (AI) is the technology driving this revolution. You may think that AI is really about big data and the cloud, and yet Qualcomm’s solutions already have the power, thermal, and processing efficiency to run powerful AI algorithms on the actual device. Our current products now support many AI use cases, such as computer vision, natural language processing, and malware detection — both for smartphones and autos — and we are researching broader topics, such as AI for wireless connectivity, power management, and photography. View this presentation to learn about our AI vision, including:
Why mobile is becoming the pervasive AI platform
The benefits of AI moving to the device and complementing the cloud
The benefits of distributed processing for AI
Qualcomm’s long history of AI research and development
What the future of AI processing might look like
Unofficial guide to the Nokia 5G Associate Certification. These are just some exam notes I made for the certification; I would recommend Nokia's extensive online training to all.
BL0-100 Nokia 5G Foundation Exam
5th generation mobile networks or 5th generation wireless systems, abbreviated 5G, are the proposed next telecommunications standards beyond the current 4G/IMT-Advanced standards.
An initial chip design by Qualcomm from October 2016, the Snapdragon X50 5G modem, supports operation in the 28 GHz band, also known as millimetre wave (mmW) spectrum. With 800 MHz of bandwidth support, it is designed to achieve peak download speeds of up to 35.46 gigabits per second.
5G planning aims at higher capacity than current 4G, allowing a higher density of mobile broadband users, and supporting device-to-device, ultra reliable, and massive machine communications.
5G research and development also aims at lower latency than 4G equipment and lower battery consumption, for better implementation of the Internet of Things.
Internet of Things
The Internet of Things (IoT) is a system of interrelated computing devices, mechanical and digital machines, objects, animals, or people that are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
Wi-Fi fingerprinting-based floor detection using adaptive scaling and weighte... (CSITiaesprime)
In practical applications, accurate floor determination in multi-building, multi-floor environments is particularly useful and plays an increasingly crucial role in the performance of location-based services. Accurate and robust building and floor detection can reduce the location search space and improve positioning and wayfinding accuracy. As an efficient solution, this paper proposes a floor identification method that exploits statistical properties of the signals propagated by wireless access points to adaptively scale the received signal strength (RSS) values in the radio map. Then, main feature extraction and dimensionality reduction are performed using a single-layer extreme learning machine-weighted autoencoder (ELM-WAE). Finally, an ELM-based classifier is trained over the new feature space to determine the floor level. To evaluate the efficiency of the proposed model, we utilized three different datasets captured in real scenarios. The evaluation results show that the proposed model achieves state-of-the-art performance and improves floor-detection accuracy compared with multiple recent techniques. The floor level is identified with 97.30%, 95.32%, and 96.39% accuracy on the UJIIndoorLoc, Tampere, and UTSIndoorLoc datasets, respectively.
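As a rough illustration of the classification stage described above, the sketch below trains a basic extreme learning machine (random fixed hidden weights, closed-form output weights) on synthetic RSS fingerprints. The data, floor labels, and layer sizes are invented for the example, and the ELM-WAE feature-extraction stage is omitted; this is not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic RSS fingerprints: 200 scans x 8 access points (dBm); the floor
# labels are a hypothetical rule invented purely for this sketch.
X = rng.uniform(-90, -30, size=(200, 8))
y = (X[:, :4].mean(axis=1) > -60).astype(int)

# Standardize features; raw dBm magnitudes would saturate tanh activations.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

def elm_fit(X, y, hidden=64, n_classes=2):
    # Basic ELM: random fixed input weights, output weights by least squares.
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)            # hidden-layer activations
    T = np.eye(n_classes)[y]          # one-hot floor targets
    beta = np.linalg.pinv(H) @ T      # closed-form output weights
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

model = elm_fit(Xs, y)
acc = (elm_predict(Xs, model) == y).mean()
```

Because the output layer is solved in one least-squares step rather than by gradient descent, training is fast, which is part of why ELM variants are attractive for fingerprinting radio maps.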
Generative AI models, such as ChatGPT and Stable Diffusion, can create new and original content like text, images, video, audio, or other data from simple prompts, as well as handle complex dialogs and reason about problems with or without images. These models are disrupting traditional technologies, from search and content creation to automation and problem solving, and are fundamentally shaping the future user interface to computing devices. Generative AI can apply broadly across industries, providing significant enhancements for utility, productivity, and entertainment. As generative AI adoption grows at record-setting speeds and computing demands increase, on-device and hybrid processing are more important than ever. Just like traditional computing evolved from mainframes to today’s mix of cloud and edge devices, AI processing will be distributed between them for AI to scale and reach its full potential.
In this presentation you’ll learn about:
- Why on-device AI is key
- Full-stack AI optimizations to make on-device AI possible and efficient
- Advanced techniques like quantization, distillation, and speculative decoding
- How generative AI models can be run on device and examples of some running now
- Qualcomm Technologies’ role in scaling on-device generative AI
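Of the techniques listed above, speculative decoding is the easiest to illustrate with a toy: a cheap draft model proposes several tokens, and the more expensive target model verifies them, accepting the agreed prefix and substituting its own token at the first mismatch. The deterministic integer "models" below are invented stand-ins, not real networks.

```python
# Toy deterministic "language models" over integer tokens 0..9. The draft is
# a cheap approximation of the target; they disagree only right after a 7.
def draft_model(ctx):
    return (ctx[-1] + 1) % 10

def target_model(ctx):
    return 0 if ctx[-1] == 7 else (ctx[-1] + 1) % 10

def speculative_decode(ctx, steps=8, k=4):
    ctx = list(ctx)
    while steps > 0:
        # 1) Draft proposes up to k tokens autoregressively (cheap).
        proposal, tmp = [], list(ctx)
        for _ in range(min(k, steps)):
            t = draft_model(tmp)
            proposal.append(t)
            tmp.append(t)
        # 2) Target verifies the proposal; accept while it agrees, and at
        #    the first mismatch keep the target's own token instead.
        accepted, check = [], list(ctx)
        for t in proposal:
            want = target_model(check)
            accepted.append(want)
            check.append(want)
            if want != t:
                break
        ctx += accepted
        steps -= len(accepted)
    return ctx

out = speculative_decode([5], steps=8, k=4)
```

The output is identical to decoding with the target model alone; the speedup comes from the target verifying a whole draft in one pass instead of generating token by token.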
As generative AI adoption grows at record-setting speeds and computing demands increase, hybrid processing is more important than ever. But just like traditional computing evolved from mainframes and thin clients to today’s mix of cloud and edge devices, AI processing must be distributed between the cloud and devices for AI to scale and reach its full potential. In this talk you’ll learn:
• Why on-device AI is key
• Which generative AI models can run on device
• Why the future of AI is hybrid
• Qualcomm Technologies’ role in making hybrid AI a reality
Qualcomm Webinar: Solving Unsolvable Combinatorial Problems with AI (Qualcomm Research)
How do you find the best solution when faced with many choices? Combinatorial optimization is a field of mathematics that seeks to find the most optimal solutions for complex problems involving multiple variables. There are numerous business verticals that can benefit from combinatorial optimization, whether transport, supply chain, or the mobile industry.
More recently, we’ve seen gains from AI for combinatorial optimization, leading to scalability of the method, as well as significant reductions in cost. This method replaces the manual tuning of traditional heuristic approaches with an AI agent that provides a fast metric estimation.
In this presentation you will find out:
Why AI is crucial in combinatorial optimization
How it can be applied to two use cases: improving chip design and hardware-specific compilers
The state-of-the-art results achieved by Qualcomm AI Research
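The core pattern, replacing an expensive exactly-evaluated metric with a fast learned estimator, can be sketched as follows. The linear "design metric" and least-squares estimator below are toy stand-ins for the slow simulators and AI agents the presentation describes, not Qualcomm's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12  # binary decision variables, e.g. a toy per-cell placement choice

# Stand-in for an expensive design metric (imagine a slow simulator call).
w_true = rng.normal(size=n)
def exact_metric(x):
    return float(x @ w_true)

# Learn a fast estimator from a handful of exact evaluations.
X_train = rng.integers(0, 2, size=(32, n)).astype(float)
y_train = X_train @ w_true
A = np.c_[X_train, np.ones(32)]
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def fast_estimate(X):
    return np.c_[X, np.ones(len(X))] @ coef

# Rank thousands of candidate configurations with the cheap estimator,
# then spend exact evaluations only on the most promising few.
cands = rng.integers(0, 2, size=(5000, n)).astype(float)
top = cands[np.argsort(fast_estimate(cands))[:5]]
best = min(top, key=exact_metric)
```

In this toy the metric is exactly linear, so the estimator ranks candidates perfectly; in practice the estimator is approximate, which is why the top candidates are still re-checked with the exact metric.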
3D perception is crucial for understanding the real world. It offers many benefits and new capabilities over 2D across diverse applications, from XR and autonomous driving to IoT, camera, and mobile. 3D perception with machine learning is creating the new state of the art (SOTA) in areas such as depth estimation, object detection, and neural scene representation. Making these SOTA neural networks feasible for real-world deployment on mobile devices constrained by power, thermal, and performance has been a challenge. Qualcomm AI Research has developed not only novel AI techniques for 3D perception but also full-stack AI optimizations to enable real-world deployments and energy-efficient solutions. This presentation explores the latest research that is enabling efficient 3D perception while maintaining neural network model accuracy. You'll learn about:
- The advantages of 3D perception over 2D and the need for 3D perception across applications
- Advancements in 3D perception research by Qualcomm AI Research
- Our future 3D perception research directions
5G is going mainstream across the globe, and this is an exciting time to harness the low latency and high capacity of 5G to enable the metaverse. A distributed-compute architecture across device and cloud can enable rich extended reality (XR) user experiences. Virtual reality (VR) and mixed reality (MR) are ready for deployment in private networks, while augmented reality (AR) for wide area networks can be enabled in the near term with Wi-Fi powered AR glasses paired with a 5G-enabled phone. Device APIs enabling application adaptation are critical for a good user experience. 5G standards are evolving to support the deployment of AR glasses at large scale, setting the stage for the 6G era with the merging of the physical, digital, and virtual worlds. Techniques like perception-enhanced wireless offer significant potential to improve user experience. Qualcomm Technologies is enabling the XR industry with platforms, developer SDKs, and reference designs.
Check out this webinar to learn:
• How 5G and distributed-compute architectures enable the metaverse
• The latest results from our boundless XR 5G/6G testbed, including device APIs and perception-enhanced wireless
• 5G standards evolution for enhancing XR applications and the road to 6G
• How Qualcomm Technologies is enabling the industry with platforms, SDKs, and reference designs
AI model efficiency is crucial for making AI ubiquitous, leading to smarter devices and enhanced lives. Besides the performance benefit, quantized neural networks also increase power efficiency for two reasons: reduced memory access costs and increased compute efficiency.
The quantization work done by the Qualcomm AI Research team is crucial for implementing machine learning algorithms on low-power edge devices. In network quantization, we focus both on pushing the state of the art (SOTA) in compression and on making quantized inference as easy to access as possible. For example, our SOTA work on oscillations in quantization-aware training pushes the boundaries of what is possible with INT4 quantization. Furthermore, for ease of deployment, integer formats such as INT16 and INT8 give accuracy comparable to floating-point formats such as FP16 and FP8, but with significantly better performance per watt. Researchers and developers can use this quantization research to optimize and deploy their models across devices with open-source tools like the AI Model Efficiency Toolkit (AIMET).
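As a minimal illustration of the idea (not AIMET's actual implementation), symmetric per-tensor INT8 quantization maps each float onto an integer in [-127, 127] via a single scale factor; dequantizing recovers the value to within half a quantization step:

```python
def quantize_int8(values):
    # Symmetric per-tensor quantization: scale so the largest magnitude
    # maps to 127, then round and clamp each value into [-127, 127].
    scale = max(abs(v) for v in values) / 127.0
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    # Recover approximate float values from the integers and the scale.
    return [qi * scale for qi in q]

weights = [0.81, -1.62, 0.04, 1.27, -0.33, 0.5]  # toy weight tensor
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(w - r) for w, r in zip(weights, restored))
```

The memory saving is the point: each weight is stored in 8 bits instead of 32, and integer arithmetic is cheaper than floating point, which is where the performance-per-watt gains come from.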
Presenters: Tijmen Blankevoort and Chirag Patel
Realizing mission-critical industrial automation with 5G (Qualcomm Research)
Manufacturers seeking better operational efficiencies, with reduced downtime and higher yield, are at the leading edge of the Industry 4.0 transformation. With mobile system components and reliable wireless connectivity between them, flexible manufacturing systems can be reconfigured quickly for new tasks, to troubleshoot issues, or in response to shifts in supply and demand.
There is a long history of R&D collaboration between Bosch Rexroth and Qualcomm Technologies for the effective application of these 5G capabilities to industrial automation use cases. At the Robert Bosch Elektronik GmbH factory in Salzgitter, Germany, this collaboration has reached new heights.
Download this deck to learn how:
• Qualcomm Technologies and Bosch Rexroth are collaborating to accelerate the Industry 4.0 transformation
• 5G technologies deliver key capabilities for mission-critical industrial automation
• Distributed control solutions can work effectively across 5G TSN networks
• A single 5G technology platform solves connectivity and positioning needs for flexible manufacturing
AI firsts: Leading from research to proof-of-concept (Qualcomm Research)
AI has made tremendous progress over the past decade, with many advancements building on fundamental research done decades earlier. Accelerating the pipeline from research to commercialization has been daunting, since scaling technologies in the real world faces many challenges beyond the theoretical work done in the lab. Qualcomm AI Research has taken on the task of not only generating novel AI research but also being first to demonstrate proofs of concept on commercial devices, enabling technology to scale in the real world. This presentation covers:
- The challenges of deploying cutting-edge research on real-world mobile devices
- How Qualcomm AI Research is solving system and feasibility challenges with full-stack optimizations to quickly move from research to commercialization
- Examples where Qualcomm AI Research has had industrial or academic firsts
Setting off the 5G Advanced evolution with 3GPP Release 18 (Qualcomm Research)
In December 2021, 3GPP reached consensus on the scope of 5G NR Release 18. This is a significant milestone marking the beginning of 5G Advanced — the second wave of wireless innovations that will fulfill the 5G vision. Release 18 will build on the solid foundation set by Releases 15, 16, and 17, and it sets the longer-term evolution direction of 5G and beyond. This release will encompass a wide range of new and enhancement projects, ranging from improved MIMO and AI/ML-enabled air interface designs to extended reality optimizations and broader IoT support.
The need for intelligent, personalized experiences powered by AI is ever-growing. Our devices are producing more and more data that could help improve our AI experiences. How do we learn and efficiently process all this data from edge devices while maintaining privacy? On-device learning rather than cloud training can address these challenges. In this presentation, we’ll discuss:
- Why on-device learning is crucial for providing intelligent, personalized experiences without sacrificing privacy
- Our latest research in on-device learning, including few-shot learning, continuous learning, and federated learning
- How we are solving system and feasibility challenges to move from research to commercialization
Data compression has improved by leaps and bounds over the years thanks to technical innovation, enabling the proliferation of streamed digital multimedia and voice over IP. For example, a regular cadence of advances in video codecs has led to massive reductions in file size — up to 1000x when comparing a raw video file to a VVC-encoded file. However, with the rise of machine learning techniques and increasingly diverse data types to compress, AI may be a compelling tool for next-generation compression, offering a variety of benefits over traditional techniques. In this presentation we discuss:
- Why the demand for improved data compression is growing
- Why AI is a compelling tool for compression in general
- Qualcomm AI Research’s latest AI voice and video codec research
- Our future AI codec research work and challenges
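A back-of-envelope check of the up-to-1000x figure, under assumed (not sourced) parameters — raw 4K60 video at 24 bits per pixel versus a hypothetical 12 Mbps VVC stream:

```python
# Raw video bitrate: every pixel of every frame stored uncompressed.
width, height = 3840, 2160      # 4K resolution
bits_per_pixel = 24             # 8 bits each for R, G, B
fps = 60

raw_bps = width * height * bits_per_pixel * fps  # bits per second, uncompressed
vvc_bps = 12_000_000                             # assumed VVC target bitrate

ratio = raw_bps / vvc_bps  # roughly 1000x under these assumptions
```

The exact ratio depends on resolution, frame rate, and the encoder's target bitrate, but the order of magnitude matches the claim in the text.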
Artificial Intelligence (AI), specifically deep learning, is revolutionizing industries, products, and core capabilities by delivering dramatically enhanced experiences. However, the deep neural networks of today use too much memory, compute, and energy. To make AI truly ubiquitous, it needs to run on the end device within tight power and thermal budgets. Advancements in multiple areas are necessary to improve AI model efficiency, including quantization, compression, compilation, and neural architecture search (NAS). In this presentation, we’ll discuss:
- Qualcomm AI Research’s latest model efficiency research
- Our new NAS research to optimize neural networks more easily for on-device efficiency
- How the AI community can take advantage of this research through our open-source projects, such as the AI Model Efficiency Toolkit (AIMET) and the AIMET Model Zoo
How to build high-performance 5G networks with vRAN and O-RAN (Qualcomm Research)
5G networks are poised to deliver an unprecedented amount of data from a richer set of use cases than we have ever seen. This makes efficient networking in terms of scalability, cost, and power critical for the sustainable growth of 5G. Cloud technologies such as virtualization, containerization and orchestration are now powering a surge of innovation in virtualized radio access network (vRAN) infrastructure with modular hardware and software components, and standardized interfaces. While commercial off-the-shelf (COTS) hardware platforms provide the compute capacity for running vRAN software, hardware accelerators will also play a major role in offloading real-time and complex signal processing functions. Together, COTS platforms and hardware accelerators provide the foundation for building the intelligent 5G network and facilitate innovative new use cases with the intelligent wireless edge.
This presentation takes a look at the technology roadmap for 5G NR millimeter wave (mmWave), including features such as integrated access and backhaul (IAB) and enhancements in beam management, mobility, coverage, and more. For more information, please visit www.qualcomm.com/mmwave
Video data is abundant and being generated at ever increasing rates. Analyzing video with AI can provide valuable insights and capabilities for many applications ranging from autonomous driving and smart cameras to smartphones and extended reality. However, as video resolution and frame rates increase while AI video perception models become more complex, running these workloads in real time is becoming more challenging. This presentation explores the latest research that is enabling efficient video perception while maintaining neural network model accuracy. You’ll learn about:
- How video perception is crucial for understanding the world and making devices smarter
- The challenges of on-device real-time video perception at high resolution through AI
- Qualcomm AI Research’s latest research and techniques for efficient video perception
Check out: https://www.qualcomm.com/AI
Enabling the rise of the smartphone: Chronicling the developmental history at... (Qualcomm Research)
Today’s smartphones are a marvel of modern technology — handheld devices with vast computing power, incredible multimedia and AI capabilities, and blazing fast data rates that support mobile browsing, social media interaction, and more. From humble beginnings as a cellphone focused purely on voice communication, the capability and functionality of modern smartphones have advanced tremendously. This presentation chronicles Qualcomm’s role in the rise of the smartphone from its initial beginnings to becoming the largest computing platform in the world. It includes:
- Key technology developments that led to today’s smartphones
- The role of Moore’s Law in driving new innovations and additional integration into mobile processors
- Qualcomm’s critical role in advancing the smartphone’s capabilities through groundbreaking innovations and key technology developments
This presentation provides an overview of important 5G innovations around new and enhanced use of spectrum. It also captures the current 5G spectrum status across the globe.
Transforming enterprise and industry with 5G private networks (Qualcomm Research)
With 5G NR Release 16, completed in July 2020, 3GPP put the spotlight on industry expansion and set the stage for enterprise and industry verticals to provide high-performance wireless connectivity with 5G private networks. With a variety of options for spectrum, different network architectures, a rich feature set to meet the demanding needs of the industrial Internet of Things (IIoT), and the privacy and security required for business assurance, 5G private networks are poised to transform enterprise and industry.
Watch the webinar at: https://pages.questexnetwork.com/Webinar-Qualcomm-Registration-101520.html?source=Qualcomm
The essential role of AI in the 5G future
1. The essential role of AI in the 5G future
5G + AI: How machine learning is accelerating wireless innovations in the new decade and beyond
San Diego, CA, September 2021, @QCOMResearch
2. The essential role of AI in the 5G future
• 5G and AI are two synergistic, essential ingredients that are fueling future innovations
• Applying AI to solve difficult wireless challenges and deliver new value
• AI plays an expanding role in the evolution of 5G towards 6G
3. Two synergistic and essential ingredients fueling future innovations
5G: the unifying connectivity fabric that can efficiently connect virtually everything around us
• Extreme capacity
• Multi-Gbps speeds
• Ultra-high reliability and low latency
• Robust end-to-end security and privacy
• New and diverse services, spectrum, deployments
AI: the learning platform that can make virtually everything around us intelligent
• Contextual awareness
• Personalization at scale
• Intelligent, intuitive, and automated actions
• Continual improvement through self-learning
• Solving seemingly impossible-to-model problems
4. 5G and AI are working together to accelerate innovations
Advancement in AI is making 5G better:
• Elevated level of performance
• More efficient resource utilization
• Energy reduction for longer battery life
• Personalized security and privacy
• Continuous enhancements over time
• New and enhanced system capabilities
Proliferation of 5G is making AI better:
• Responsive user experiences and services
• Lifelong learning
• Flexibility for distributed functionality across devices
• On-device intelligence assisted by cloud
• Scaled intelligence through distributed learning
• Massive data aggregation for improved AI models
6. Transformation of the Connected Intelligent Edge has begun at scale
Spanning cloud, edge cloud, and on-device, across public and private networks:
• Past: cloud-centric AI, with AI training and inference in the central cloud
• Today: partially distributed AI, with power-efficient on-device AI inference
• Future: fully distributed AI, with lifelong on-device learning
Processing data closer to devices at the edge derives new system values (e.g., lower latency, enhanced privacy)
8. Federated learning brings on-device learning to a new level
• Offline learning: training on data offline, prior to deployment
• On-device learning (local adaptation): training on local data to adapt once to a few samples (e.g., few-shot learning) or continuously (e.g., unsupervised learning)
• Federated learning (global adaptation): workers train on local data and a coordinator aggregates model updates across multiple users, over 5G connectivity, to globally improve the model from more diverse data
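The coordinator/worker flow above amounts to federated averaging: each device takes a training step on its own data and only the model updates travel over the network. A minimal sketch with a one-parameter linear model and two simulated devices (all names and data are hypothetical):

```python
def local_update(w, local_data, lr=0.1):
    # One round of on-device training: a gradient step on local data
    # for a toy one-parameter linear model y = w * x (squared-error loss).
    grad = sum(2 * (w * x - y) * x for x, y in local_data) / len(local_data)
    return w - lr * grad

def federated_round(w, workers):
    # Coordinator averages the workers' model updates; raw data never
    # leaves the devices, only updated weights do.
    updates = [local_update(w, data) for data in workers]
    return sum(updates) / len(updates)

# Two hypothetical devices holding different samples of the relation y = 3x.
workers = [[(1.0, 3.0), (2.0, 6.0)], [(0.5, 1.5), (4.0, 12.0)]]
w = 0.0
for _ in range(50):
    w = federated_round(w, workers)
# w converges to 3.0, the slope both devices' data agree on.
```

Real systems average full weight vectors, weight updates by local dataset size, and add compression and privacy mechanisms, but the round structure is the same.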
9. Our research focus to improve 5G with AI
With a cloud-to-network-to-device approach for data collection and learning:
• Distributed cloud: central and edge clouds, for end-to-end service optimization
• Disaggregated network: virtualized and disaggregated RAN, for network and device optimization
• Edge devices: on-device intelligence, for local device optimization
Steps towards enabling AI/ML cloud and device platforms:
• Shorter term: continue to define data collection for new use cases hosted at the RAN
• Medium term: enable jointly optimized AI/ML use cases between RAN functions and the device
• Longer term: joint cloud, core, RAN, and device AI/ML functions
11. AI-enhanced wireless communications: applying AI to solve difficult wireless challenges
Deep wireless domain knowledge is required to optimally use AI capabilities.
Wireless challenges:
• Hard-to-model problems
• Computational infeasibility of the optimal solution
• Efficient modem parameter optimization
• Dealing with non-linearity
AI strengths:
• Determining appropriate representations for hard-to-model problems
• Finding near-ideal and computationally realizable solutions
• Modeling non-linear functions
12. On-device AI improves the 5G end-to-end system
• Enhanced device experience: more intelligent beamforming and power management improve throughput, robustness, and battery life
• Improved system performance: on-device inference reduces network data traffic for more efficient mobility and spectrum utilization
• Better radio security: detecting and defending against malicious base station spoofing and jamming with fingerprinting
• Radio awareness: environmental and contextual sensing that reduces access overhead and latency
13. Radio awareness achieved by advanced on-device AI algorithms
• Spectrum sensing and access: predict the activities of other devices for more efficient access and better scheduling to improve 5G system performance
• Environment (RF) sensing: detect gestures, movements, and objects by monitoring signal reflection patterns to enable new use cases
• Contextual awareness: use device context (e.g., position, velocity, or in-car) derived from RF, sensors, and traffic activities to improve the device experience
14. On-device AI enhances the 5G device experience
• Better beam management: incorporate location, velocity, and other aspects of environmental and application awareness to improve robustness and throughput
• More power saving: optimize performance/power-consumption tradeoffs by taking advantage of better contextual awareness on the device
Deep reinforcement learning example: the on-device agent observes the state (the M strongest RSRPs), takes an action in the environment (select Tx/Rx beams), and receives a reward (throughput, power savings)
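The agent/environment loop on this slide can be sketched with a toy bandit-style learner choosing among beams; the beam count, the reward values, and the static "best beam" environment are all invented for illustration (a real system would use a deep network over RSRP state, not a value table):

```python
import random

N_BEAMS = 8          # hypothetical beam codebook size
TRUE_BEST = 5        # hidden best beam for the current (static) channel

def reward_for(beam):
    # Toy throughput proxy: high reward when the chosen Tx/Rx beam
    # matches the channel's best beam, low otherwise.
    return 1.0 if beam == TRUE_BEST else 0.1

random.seed(0)
q = [0.0] * N_BEAMS  # learned value estimate for each beam choice
alpha, eps = 0.2, 0.1

for _ in range(2000):
    # Epsilon-greedy policy: mostly exploit the best-known beam,
    # occasionally explore another one.
    if random.random() < eps:
        beam = random.randrange(N_BEAMS)
    else:
        beam = max(range(N_BEAMS), key=lambda b: q[b])
    r = reward_for(beam)
    q[beam] += alpha * (r - q[beam])  # incremental value update

best_beam = max(range(N_BEAMS), key=lambda b: q[b])
```

The exploration/exploitation tradeoff is exactly what makes reinforcement learning suited to beam management: the device can probe alternative beams occasionally without committing throughput to them.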
15. On-device AI improves 5G system performance
• Better link adaptation: position-aware interference prediction can improve overall system throughput and spectral efficiency
• More seamless mobility: device-centric mobility utilizes on-device AI and sensors to predict handovers
• Reduced network loading: on-device AI inference reduces the amount of raw data that needs to be sent across the network (only the AI inference result is transmitted)
17. Applying AI for enhanced 5G air interface efficiency
Improving system spectral efficiency: implementing a neural network framework for CSI (Channel State Information) based on non-linear temporal encoding and decoding
• Encoder at the device: downlink channel estimates pass through the CSI encoder and are fed back over a data or control channel
• Decoder at the base station: the CSI decoder reconstructs the downlink channel estimates from the feedback
Improving device power efficiency (example: for uplink transmissions):
• Allowing the receiver to recover the signal from a device operating in the non-linear PA region
• Optimizing the transmit waveform to reduce peak-to-average power ratio (PAPR)
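As an illustration of the encode/feedback/decode pipeline (not the actual neural CSI framework, which uses learned non-linear encoders and decoders), a toy encoder that compresses a channel estimate by keeping only its strongest taps:

```python
def csi_encode(channel, k=4):
    # Device-side encoder: keep only the k strongest channel taps.
    # This stands in for the learned CSI encoder on the slide.
    idx = sorted(range(len(channel)), key=lambda i: -abs(channel[i]))[:k]
    return [(i, channel[i]) for i in sorted(idx)]

def csi_decode(code, n):
    # Base-station decoder: rebuild the channel estimate, zeros elsewhere.
    est = [0.0] * n
    for i, v in code:
        est[i] = v
    return est

channel = [0.9, 0.05, -0.7, 0.02, 0.4, -0.03, 0.6, 0.01]  # toy estimate
code = csi_encode(channel)            # compact feedback sent uplink
recon = csi_decode(code, len(channel))
err = sum((a - b) ** 2 for a, b in zip(channel, recon)) ** 0.5
```

Here 8 values are fed back as 4 (index, value) pairs with small reconstruction error; a learned encoder improves on this by exploiting temporal and spatial structure in the channel rather than raw magnitude.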
19. 19
1. For example, Observed Time Difference of Arrival (OTDOA), Multiple Round Trip Time (Multi-RTT), Angle of Arrival (AoA)
More accurate device positioning
Learning device position over time without prior
knowledge with RF sensing — complementing
existing positioning methodologies1
Blockage
RF sensing
Multi-path detection
NLOS
LOS
Applying AI for contextual awareness and
environmental sensing
Motion and gesture detection
Sensing changes in environment to infer location
and type of motion for a wide range of use cases
(e.g., vital sign tracking, fall detection)
20. 20
5G positioning is supplemented by
various assisting information, such
as GNSS, multi-path profiles, and
other sensors
AI/ML for
enhanced 5G
positioning
performance
Positioning measurements
(e.g., RTT + AoA)
GNSS information
Sensor information
(e.g., accelerometer)
Channel multi-path profile
(e.g., NLOS)
Intelligent
Location Server
Neural Network
Improved device
position estimate
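A classical stand-in for the learned fusion shown here is inverse-variance weighting of independent position estimates: noisier sources contribute less, and the fused estimate is more certain than any single input. The sources and numbers below are hypothetical, and real positioning works in 2D/3D rather than the 1D used for brevity:

```python
def fuse(estimates):
    # Inverse-variance weighted fusion of independent position estimates.
    # Each estimate is (position, variance); lower variance = more trust.
    wsum = sum(1.0 / var for _, var in estimates)
    pos = sum(p / var for p, var in estimates) / wsum
    fused_var = 1.0 / wsum  # fused estimate is tighter than any input
    return pos, fused_var

# Hypothetical 1-D position estimates (metres) with per-source variance:
# cellular RTT+AoA, GNSS, and an accelerometer-based dead-reckoning track.
estimates = [(10.4, 4.0), (9.8, 1.0), (10.9, 9.0)]
pos, var = fuse(estimates)
```

A neural network generalizes this by learning the weighting (and non-linear corrections, e.g., for NLOS bias) from data instead of assuming known, independent Gaussian errors.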
22. Demo: wireless AI-assisted indoor positioning
Leveraging unsupervised/weakly supervised learning — also applicable to 5G RF sensing (e.g., for positioning, motion and gesture detection)
23. AI enables intelligent 5G network management
• Enhanced service quality: better mobility management, user localization, and user behavior and demand prediction
• Higher network efficiency: more efficient scheduling, radio resource utilization, congestion control, and routing
• Simplified deployment: more capable Self-Organizing Networks (SON), e.g., for mmWave network densification
• Improved network security: more effective detection of, and defense against, malicious attacks by analyzing massive quantities of data
24. A more intelligent way to deploy 5G mmWave: smart mmWave densification
• Create a digital twin of the targeted deployment based on readily available sources/databases (e.g., Google Street/Aerial view, GIS, ...)
• Reduce model complexity and balance tradeoffs through ML-based object recognition, clustering, and pruning
• Optimize the mmWave network deployment to provide focused capacity using diverse existing and new infrastructure (e.g., repeaters, IAB, ...)
25. Distributed topology enables more efficient deployments
With standardization in, e.g., 3GPP and the O-RAN Alliance:
• Diversity of interconnectivity: e.g., fiber, out-of-band wireless, ...
• Diversity of node types: e.g., small cells, IAB, repeaters, ...
• Many potential radio locations: e.g., for different objectives
Topology elements: centralized coordination; fiber backhaul and fronthaul; Integrated Access and Backhaul (IAB); smart and simple repeaters; O-RAN radio units; and small cells, connected via fiber, in-band, or out-of-band wireless fronthaul
31. Advancing 5G to deliver its full promise
Enhanced mobile experiences, new capabilities, and expansion to diverse verticals
32. Standardizing AI/ML in cellular communication systems
A broad range of work across standards and industry organizations:
• Data collection for network performance enhancements (RAN2/3, Rel-16/17)
• Study on AI/ML functional frameworks and use cases (RAN3, Rel-17)
• Network data analytics function for core AI/ML use cases (SA2, Rel-16/17)
• Management data analytics service and autonomous network (SA5, Rel-17)
• Study on AI/ML model transfer performance requirements over 5G (SA1, Rel-17)
• Developing AI use cases; an architectural framework for ML; a framework for evaluating intelligence level; and a framework for data handling to enable ML
• AI for autonomous and assisted driving
• Defining the reference architecture for the RAN Intelligent Controller (RIC) and its interfaces; developing the technical report "AI/ML Workflow Description and Requirements"
• Developing an AI Mobile Device Requirement Specification (TS.47), focusing on AI mobile phones and tablets (which may extend to IoT/wearables in future releases)
• Network automation and autonomy based on AI; defining requirements and platform recommendations for reference implementations and interface standards
33. Driving the 5G technology evolution in the new decade
A unified, future-proof platform and a longer-term evolution to deliver on the 5G vision: new verticals, deployments, use cases, and spectrum. (3GPP start dates indicate approval of the study package — study item, then work item, then specifications; a previous release continues beyond the start of the next release with functional freezes and ASN.1.)
• Rel-15 (2018) eMBB focus: 5G NR foundation; smartphones, FWA, PC; expanding to venues and enterprises
• Rel-16 industry expansion: eURLLC and TSN for IIoT; NR in unlicensed spectrum; 5G V2X sidelink multicast; in-band eMTC/NB-IoT; positioning
• Rel-17 continued expansion: lower-complexity NR-Light; higher-precision positioning; improved IIoT, V2X, IAB, and more
• Rel-18+ 5G Advanced: the next set of 5G releases (i.e., 18, 19, 20, ...); potential projects in discussion; Rel-18 expected to start in 2022
• Rel-19 and Rel-20+ evolution continuing through 2025, 2026, 2027 and beyond
34. Data collection for network performance enhancements (part of 3GPP Release 16)
• Enhanced Network Automation (eNA): a new, enhanced core network function for data collection and exposure, expanding the NWDAF (Network Data Analytics Function) from providing network slice analysis in Rel-15 to data collection and exposure from/to 5G core NFs, AFs, OAM (Network Function, Application Function, Operations Administration and Maintenance), and data repositories, delivering analytics data along with activity data and local analytics
• Minimization of Drive Testing (MDT): logged and immediate MDT, mobility history information, and accessibility and L2 measurements (for standalone and dual-connected 5G NR systems); specifying features for identified use cases, including coverage optimization, QoS verification, location information reporting, and sensor data collection (measurements such as average delay and sensor information)
• Self-Organizing Network (SON): mobility robustness optimization (MRO), mobility load balancing (MLB), and RACH optimization; specifying the device reporting needed to enhance network configurations and inter-node information exchange (e.g., enhancements to interfaces like N2 and Xn)
35. 35
Expanding 5G system support for wireless machine learning
Part of 3GPP Release 17

Enhancements for 5G network interfaces
Facilitating machine learning procedures such as model training and inference, as well as actions to enforce model inference output

Augmented network and device data collection
Supporting targeted applications (e.g., energy saving, load balancing, mobility management), operations enhancements, and expanded use cases1

Support for over-the-top AI/ML services
Introducing new QoS (Quality of Service) definitions that are tailored for machine learning model delivery over 5G

1 Such as multicast, broadcast, V2X, sidelink, multi-SIM, RAN slicing, and more
36. 36
5G Advanced (Rel-18+) targets to expand wireless machine learning to the end-to-end system across RAN, device, and air interface

AI/ML procedure enhancements
Optimizing the system for model management, training (e.g., federated and reinforcement learning), and inference

Data management enhancements
Standardizing ML data storage/access, data registration/discovery, and data request/subscription

New and expanded use cases
Supporting traffic/mobility prediction, coverage/capacity optimization, massive MIMO, SON, CSI feedback, beam management, and other PHY/MAC and upper-layer improvements

Network architecture enhancements
Allowing machine learning to run over different HW/SW and future RAN function splits to improve flexibility and efficiency
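The federated model training mentioned above can be sketched in a few lines: each device computes a local update on its own data, and only model weights are aggregated, never the raw data. Everything below (function names, the toy scalar model, the learning rate) is an illustrative assumption, not a 3GPP-defined procedure.

```python
# Toy federated averaging (FedAvg) sketch: devices fit y = w*x locally;
# an aggregator averages their weights, weighted by local dataset size.

def local_update(w, data, lr=0.1):
    """One gradient-descent step on squared error for a scalar model."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_average(weights, sizes):
    """Size-weighted average of device models (the FedAvg step)."""
    return sum(w * n for w, n in zip(weights, sizes)) / sum(sizes)

# Two devices hold disjoint samples of the same underlying relation y = 2x.
device_data = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w_global = 0.0
for _ in range(200):                       # communication rounds
    local_ws = [local_update(w_global, d) for d in device_data]
    w_global = federated_average(local_ws, [len(d) for d in device_data])

print(round(w_global, 2))  # -> 2.0: devices converge without sharing data
```

The key design point this illustrates is privacy: the aggregation step only ever sees model parameters, which is why federated learning pairs naturally with on-device AI processing.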
37. 37
AI-native air interface design can enable continual system improvements in between major 3GPP releases through self-learning

During the nominal Work/Study Item phase toward a subsequent release (roughly a 1.5-year cycle from Release X to Release X+1), there is no standardized improvement; data-driven communication and network design lets machine learning bring continuous wireless enhancements in the meantime:
• Data-driven system configuration provides end-to-end optimizations
• Dynamic parameter adaptation based on fast machine learning algorithms
• Neural network system design can customize to a given wireless environment
38. 38
Machine learning is a key technology vector on the path to 6G

Wireless AI/ML
Data-driven communication and network design, with joint training, model sharing, and distributed inference across networks and devices

New radio designs
Waveform/coding for MHz to THz, intelligent surfaces, joint comms. and sensing, large-scale MIMO, advanced duplexing, energy-efficient RF

Merging of worlds
Physical, digital, virtual; e.g., ubiquitous low-power sensing/monitoring and immersive interactions taking human augmentation to the next level

Scalable network architecture
Disaggregation and virtualization from cloud to edge, and use of advanced relay/mesh topologies to address growing demand

Coordinated spectrum sharing
New paradigms for more efficient use of spectrum, leveraging location/environmental awareness for dynamic/adaptive coordination

Communications resiliency
End-to-end configurable security, post-quantum security, robust networks tolerant to failures and attacks
39. 39
E2E approach with machine learning to improve 5G system performance and efficiency across network and devices
Transition to ML data-driven air interface design and operation

Network and device use cases
• Link parameter prediction
• Multi-cell interference learning
• Mobility parameter prediction
• Predictive beam management
• Channel state measurements
• Device positioning
• Fully autonomous networks
• Interference coordination/scheduling
• Mobility handoff decisions
• Joint sensing-communications
• Dynamic ML model adaptation
• Personalized lifelong learning

Air interface (longer-term R&D direction)
• Neural network air interface design for coding, waveform, and multiple access
• Dynamic air interface operation and adaptation
• Joint training, model sharing, and distributed inference across network and devices
• Data-driven propagation models
• Data-driven optimization of signaling, measurements, and feedback
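As a deliberately simplified illustration of the predictive beam management idea mentioned above: instead of exhaustively sweeping every beam on every occasion, a device can smooth per-beam measurements over time and predict the best beam from history. The class name, the EWMA smoothing choice, and the numbers are hypothetical, not a standardized algorithm.

```python
# Hypothetical sketch: predict the best beam from smoothed measurement
# history (exponentially weighted moving average of RSRP), rather than
# sweeping all beams each time a decision is needed.

class BeamPredictor:
    def __init__(self, num_beams, alpha=0.3):
        self.alpha = alpha                 # weight given to the newest sample
        self.score = [None] * num_beams    # smoothed RSRP (dBm) per beam

    def update(self, beam, rsrp_dbm):
        prev = self.score[beam]
        if prev is None:                   # first measurement for this beam
            self.score[beam] = rsrp_dbm
        else:
            self.score[beam] = (1 - self.alpha) * prev + self.alpha * rsrp_dbm

    def best_beam(self):
        measured = [i for i, s in enumerate(self.score) if s is not None]
        return max(measured, key=lambda i: self.score[i])

pred = BeamPredictor(num_beams=3)
for sweep in [(-90.0, -80.0, -95.0), (-92.0, -79.0, -94.0)]:
    for beam, rsrp in enumerate(sweep):
        pred.update(beam, rsrp)

print(pred.best_beam())  # -> 1: beam 1 has the strongest smoothed RSRP
```

In a real system the predictor would be a learned model fed with richer context (position, orientation, past handoffs); the smoothing filter here just makes the "predict instead of sweep" trade-off concrete.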
40. 40
5G+AI: our AI research areas to advance wireless communication

Signal intelligence, baseband, and medium access
• ML-based channel feedback
• Channel estimation & pilot optimization
• MIMO detection
• Link prediction & adaptation
• Beam management and optimization
• Spectrum sensing and sharing
• Radio resource scheduling

Device intelligence and optimization
• Digital front-end optimization
• Antenna and RF optimization
• Full duplex
• Battery saving
• Reflective intelligent surface

Network intelligence and system optimization
• Coverage and capacity optimization
• Traffic and mobility prediction
• Energy saving
• Cooperative edge caching
• Content-aware X-layer optimization
• Enhanced personalized security
• TCP optimization

Vertical intelligence and other capabilities
• High-precision positioning
• Environmental sensing
• Contextual awareness
• Sensor fusion
• Vehicular communication
42. 42
Our AI leadership
Over a decade of cutting-edge AI R&D, speeding up commercialization and enabling scale
Consistent AI R&D investment is the foundation for product leadership

[Timeline, 2007 to 2021; selected milestones]
• 2007: Qualcomm Research initiates its first AI project; research in spiking neural networks
• Research on artificial neural processing architectures; deep-learning-based AlexNet wins the ImageNet competition
• Investment in and collaboration with Brain Corp; Brain Corp raises $114M; Brain Corp joint research completed
• Opened Qualcomm Research Netherlands; research on face detection with deep learning; acquired EuVision
• 1st Gen Qualcomm® AI Engine (Snapdragon® 820 Mobile Platform); MWC demo showcasing photo sorting and handwriting recognition
• 2nd Gen Qualcomm AI Engine (Snapdragon 835); Qualcomm® Neural Processing SDK; announced Facebook Caffe2 support; collaboration with Google on TensorFlow; acquired Scyfer
• 3rd Gen Qualcomm AI Engine (Snapdragon 845); Snapdragon 710, 660, 630; Qualcomm Technologies ships ONNX, supported by Microsoft, Facebook, and Amazon; Qualcomm Artificial Intelligence Research initiated; opened joint research lab with the University of Amsterdam
• 4th Gen Qualcomm AI Engine (Snapdragon 855); Snapdragon 665, 730, 730G; Qualcomm® Cloud AI 100; Qualcomm® QCS400 (first audio SoC); Qualcomm® Vision Intelligence Platform; Mobile AI Enablement Center in Taiwan; Qualcomm Technologies researchers win best paper at ICLR
• 5th Gen Qualcomm AI Engine (Snapdragon 865); 3rd Gen Snapdragon Automotive Cockpit; power-efficiency gains through compression, quantization, and compilation; gauge equivariant CNNs; AI Model Efficiency Toolkit (AIMET) open sourced; Qualcomm® Robotics RB5 Platform
• 2021: 6th Gen Qualcomm AI Engine (Snapdragon 888); Snapdragon Ride™ Platform; AIMET Model Zoo open sourced

Qualcomm Artificial Intelligence Research is an initiative of Qualcomm Technologies, Inc. Snapdragon, Qualcomm Neural Processing SDK, Qualcomm Vision Intelligence Platform, Qualcomm AI Engine, Qualcomm Cloud AI, Snapdragon Ride, Qualcomm Robotics RB3 Platform, and Qualcomm QCS400 are products of Qualcomm Technologies, Inc. and/or its subsidiaries.
AIMET and AIMET Model Zoo are products of Qualcomm Innovation Center, Inc.
43. 43
Industry-leading AI use cases
• Super movie with Tetras.AI
• Snapchat lenses acceleration
• NLP with Hugging Face
• Skin condition detection with trinamiX
• TVM open source: more efficient coding

AI highlights*
• Qualcomm AI Engine Direct: easier and faster access to the entire AI Engine
• Qualcomm Neural Processing SDK & AI Model Efficiency Toolkit: new features and improvements
• 2nd Gen Qualcomm® Sensing Hub: dedicated AI accelerator; first to support TensorFlow Micro
• Qualcomm® Hexagon™ 780 Processor with fused AI accelerators:
  • Tensor: 2X compute capacity
  • Scalar: 50% performance improvement
  • Vector: support for additional data types
  • 3X performance per watt improvement
  • 16X dedicated memory
  • Up to 1000X handoff time improvement in certain use cases
• 6th Gen Qualcomm AI Engine: 26 TOPS

*Compared to previous generations
Qualcomm Sensing Hub and Qualcomm Hexagon are products of Qualcomm Technologies, Inc. and/or its subsidiaries.
45. 45
Advancing AI research to make efficient AI ubiquitous
A platform to scale AI across the industry: cloud, edge cloud, automotive, IoT/IIoT, and mobile

Perception
Object detection, speech recognition, contextual fusion

Reasoning
Scene understanding, language understanding, behavior prediction

Action
Reinforcement learning for decision making

Power efficiency
Model design, compression, quantization, algorithms, efficient hardware, software tools

Personalization
Continuous learning, contextual, always-on, privacy-preserved, distributed learning

Efficient learning
Robust learning through minimal data, unsupervised learning, on-device learning
46. 46
AIMET and AIMET Model Zoo are products of Qualcomm Innovation Center, Inc.

Leading AI research and fast commercialization
Driving the industry towards integer inference and power-efficient AI

Quantization research
• Relaxed Quantization (ICLR 2019)
• Data-Free Quantization (ICCV 2019)
• AdaRound (ICML 2020)
• Bayesian Bits (NeurIPS 2020)
• Transformer Quantization (EMNLP 2021)

Quantization open-sourcing
• AI Model Efficiency Toolkit (AIMET)
• AIMET Model Zoo
47. 47
AIMET makes AI models small
Open-sourced GitHub project that includes state-of-the-art quantization and compression techniques from Qualcomm AI Research
If interested, please join the AIMET GitHub project: https://github.com/quic/aimet

Pipeline: trained AI model (TensorFlow or PyTorch) → AI Model Efficiency Toolkit (AIMET), applying compression and quantization → optimized AI model → deployed AI model

Features:
• State-of-the-art network compression tools
• State-of-the-art quantization tools
• Support for both TensorFlow and PyTorch
• Benchmarks and tests for many models
• Developed by professional software developers
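One of the compression ideas used by tools in this space, channel pruning, can be illustrated with a toy magnitude criterion: rank the output channels of a convolution layer by weight norm and drop the weakest ones. This is a hedged sketch of the general technique under our own naming, not AIMET's actual API or selection criterion.

```python
# Toy magnitude-based channel pruning (a sketch of the idea, not AIMET code):
# rank output channels by the L2 norm of their weights and keep the top ones.

def channel_l2(channel_weights):
    """L2 norm of one output channel's weights."""
    return sum(w * w for w in channel_weights) ** 0.5

def prune_channels(weights, keep_ratio):
    """weights: one weight list per output channel. Returns the kept weights
    and the indices of surviving channels (needed to rewire the next layer)."""
    k = max(1, int(len(weights) * keep_ratio))
    ranked = sorted(range(len(weights)),
                    key=lambda i: channel_l2(weights[i]), reverse=True)
    keep = sorted(ranked[:k])              # preserve original channel order
    return [weights[i] for i in keep], keep

# Four output channels; the two near-zero ones are removed at keep_ratio=0.5.
layer = [[0.9, -0.8], [0.01, 0.02], [1.5, 0.1], [0.05, -0.03]]
pruned, kept = prune_channels(layer, keep_ratio=0.5)
print(kept)  # -> [0, 2]: the strongest channels survive
```

Production tools refine this with per-layer sensitivity analysis and fine-tuning after pruning, but the core trade-off (fewer channels, less compute and memory, some accuracy risk) is the same.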
48. 48
AIMET: providing advanced model efficiency features and benefits

Benefits
• Lower memory bandwidth
• Lower power
• Lower storage
• Higher performance
• Maintains model accuracy
• Simple ease of use

Features
• Quantization: state-of-the-art INT8 and INT4 performance; post-training quantization methods, including Data-Free Quantization and Adaptive Rounding (AdaRound, coming soon); quantization-aware training; quantization simulation
• Compression: efficient tensor decomposition and removal of redundant channels in convolution layers; spatial singular value decomposition (SVD); channel pruning
• Visualization: analysis tools for drawing insights for quantization and compression; weight ranges; per-layer compression sensitivity
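The quantization simulation listed above can be illustrated with the textbook uniform affine quantize/dequantize round trip: fold each float weight onto an 8-bit integer grid and back, so the accuracy impact can be measured before deploying to integer hardware. The helper names below are our own; this is a sketch of the standard scheme, not AIMET's implementation.

```python
# Uniform affine (asymmetric) INT8 quantization simulation sketch:
# map floats in [xmin, xmax] to integers in [0, 255] and back.

def quantize_params(xmin, xmax, num_bits=8):
    """Compute the scale and zero-point covering the range [xmin, xmax]."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (xmax - xmin) / (qmax - qmin)
    zero_point = round(qmin - xmin / scale)
    return scale, zero_point

def quant_dequant(x, scale, zero_point, num_bits=8):
    """Round-trip one value through the integer grid (simulated error)."""
    qmax = 2 ** num_bits - 1
    q = round(x / scale + zero_point)
    q = max(0, min(qmax, q))            # clamp to the representable range
    return (q - zero_point) * scale     # back to float

weights = [-1.0, -0.2, 0.0, 0.4, 1.5]
scale, zp = quantize_params(min(weights), max(weights))
sim = [quant_dequant(w, scale, zp) for w in weights]
max_err = max(abs(a - b) for a, b in zip(weights, sim))
print(max_err <= scale / 2)  # -> True: error is bounded by half a step
```

Running a model with such simulated weights (and activations) is what lets a toolkit report expected on-target accuracy, and it is the starting point for the post-training methods named above, which then adjust rounding or ranges to shrink that error further.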
49. 49
AIMET Model Zoo
Accurate pre-trained 8-bit quantized models for:
• Image classification
• Semantic segmentation
• Pose estimation
• Speech recognition
• Super resolution
• Object detection
50. 50
The essential role of AI in the 5G future
How machine learning is accelerating wireless innovations in the new decade and beyond

• 5G and AI are two synergistic, essential ingredients that are fueling future innovations
• Applying AI techniques to solve difficult wireless challenges and deliver new value
• Machine learning plays an expanding role in the evolution of 5G towards 6G