On March 21, 2020, we participated in the worldwide Arduino Day 2020 and organized the online event for Bandung, Indonesia. This is the deck I delivered for my talk and demo.
TinyML: Machine Learning for Microcontrollers - Robert John
My presentation at the TensorFlow User Groups Sub-Saharan Africa Summit discusses machine learning for embedded devices, its importance, and its challenges.
Introduction to TinyML - Solomon Muhunyo Githu
What is TinyML? Why so tiny?
This presentation simplifies what TinyML is about. It describes the importance of this technology and also shares some basic guidelines on how it is implemented.
Artificial Intelligence Machine Learning Deep Learning Ppt Powerpoint Present... - SlideTeam
Choose our Artificial Intelligence Machine Learning Deep Learning PPT PowerPoint Presentation Slide Templates to understand this popular branch of computer science. Acquaint your audience with the process of building smart, capable machines that can perform intelligent tasks with the help of this neural network PPT presentation. Exhibit the difference between AI, machine learning, and deep learning through this informative robotics PPT design. Elaborate on the wide range of areas that can benefit from artificial intelligence, such as supply chain, customer experience, human resources, fraud detection, and research and development, with the aid of this computer science PPT slideshow. Highlight the booming AI business and its future revenue forecast by downloading these thought-provoking and engaging information technology PowerPoint graphics. Save time and effort with these ready-made, professionally crafted, content-specific slides that educate your audience about this complex process in an easy yet efficient way. Download this AI functioning PowerPoint deck to create a roadmap for the growth and expansion of your business. https://bit.ly/3x135nD
A presentation of the AI/ML landscape in 2023: a quick summary of the last 10 years of progress, the current situation, and a look at what is happening behind the scenes.
This presentation discusses AI and machine learning. It was given during the ITU-T workshop on Machine Learning for 5G and beyond, held at ITU HQ in Geneva, Switzerland, on 29 January 2018. More information on the workshop can be found here: https://www.itu.int/en/ITU-T/Workshops-and-Seminars/20180129/Pages/default.aspx
Join our upcoming forums and workshops here: https://www.itu.int/en/ITU-T/Workshops-and-Seminars/Pages/default.aspx
AI Vs ML Vs DL PowerPoint Presentation Slide Templates Complete Deck - SlideTeam
AI Vs ML Vs DL PowerPoint Presentation Slide Templates Complete Deck is loaded with easy-to-follow content and an intuitive design. Introduce the types and levels of artificial intelligence using the highly effective visuals featured in this PPT slide deck. Showcase the AI subfield of machine learning, as well as deep learning, through our comprehensive PowerPoint theme. Represent the differences and interrelationship between AI, ML, and DL. Elaborate on the scope and use cases of machine intelligence in healthcare, HR, banking, supply chain, or any other industry. Take advantage of the infographic-style layout to describe why AI is flourishing today. Elucidate AI trends such as robotic process automation, advanced cybersecurity, AI-powered chatbots, and more. Cover all the essentials of machine learning and deep learning with the help of this PPT slideshow. Outline the applications, algorithms, use cases, significance, and selection criteria for machine learning. Highlight the deep learning process, types, limitations, and significance. Describe reinforcement training, neural network classifications, and a lot more. Hit download and begin personalization. Our AI Vs ML Vs DL PowerPoint Presentation Slide Templates Complete Deck is topically designed to provide an attractive backdrop to any subject. Use it to look like a presentation pro. https://bit.ly/3ngJCKf
This presentation introduces the hardware anyone can use to get started with the Internet of Things (IoT), such as Arduino, Raspberry Pi, and ESP8266.
About 30 years ago, AI was not only a topic for science-fiction writers but also a major research field surrounded by huge hopes and investments. The over-inflated expectations, however, ended in a crash, followed by a period of absent funding and interest – the so-called AI winter. The last three years have changed everything – again. Deep learning, a machine learning technique inspired by the human brain, successfully crushed one benchmark after another, and tech companies like Google, Facebook, and Microsoft started to invest billions in AI research. “The pace of progress in artificial general intelligence is incredibly fast” (Elon Musk, CEO of Tesla & SpaceX), leading to an AI that “would be either the best or the worst thing ever to happen to humanity” (Stephen Hawking, physicist).
What sparked this new hype? How is deep learning different from previous approaches? Are the advancing AI technologies really a threat to humanity? Let’s look behind the curtain and unravel the reality. This talk will explore why Sundar Pichai (CEO of Google) recently announced that “machine learning is a core transformative way by which Google is rethinking everything they are doing” and explain why “Deep Learning is probably one of the most exciting things that is happening in the computer industry” (Jen-Hsun Huang, CEO of NVIDIA).
Either a new AI “winter is coming” (Ned Stark, House Stark), or this new wave of innovation might turn out to be the “last invention humans ever need to make” (Nick Bostrom, AI philosopher). Or maybe it’s just another great technology helping humans achieve more.
What Is Deep Learning? | Introduction to Deep Learning | Deep Learning Tutori... - Simplilearn
This Deep Learning presentation will help you understand what deep learning is, why we need it, and its applications, along with a detailed explanation of neural networks and how they work. Deep learning is inspired by the workings of the human brain, specifically artificial neural networks. These networks, which mirror the decision-making process of the brain, use complex algorithms that process data in a non-linear way, learning in an unsupervised manner to make choices based on the input. This tutorial is ideal for professionals with beginner to intermediate levels of experience. Now, let us dive deep into this topic and understand what deep learning actually is.
Below topics are explained in this Deep Learning Presentation:
1. What is Deep Learning?
2. Why do we need Deep Learning?
3. Applications of Deep Learning
4. What is Neural Network?
5. Activation Functions
6. Working of Neural Network
Simplilearn’s Deep Learning course will transform you into an expert in deep learning techniques using TensorFlow, the open-source software library designed to conduct machine learning & deep neural network research. With our deep learning course, you’ll master deep learning and TensorFlow concepts, learn to implement algorithms, build artificial neural networks and traverse layers of data abstraction to understand the power of data and prepare you for your new role as deep learning scientist.
Why Deep Learning?
TensorFlow is one of the most popular software platforms used for deep learning and contains powerful tools to help you build and implement artificial neural networks.
Advancements in deep learning are being seen in smartphone applications, creating efficiencies in the power grid, driving advancements in healthcare, improving agricultural yields, and helping us find solutions to climate change. With this TensorFlow course, you’ll build expertise in deep learning models and learn to operate TensorFlow to manage neural networks and interpret the results.
You can gain in-depth knowledge of Deep Learning by taking our Deep Learning certification training course. With Simplilearn’s Deep Learning course, you will prepare for a career as a Deep Learning engineer as you master concepts and techniques including supervised and unsupervised learning, mathematical and heuristic aspects, and hands-on modeling to develop algorithms.
There is booming demand for skilled deep learning engineers across a wide range of industries, making this deep learning course with TensorFlow training well-suited for professionals at the intermediate to advanced level of experience. We recommend this deep learning online course particularly for the following professionals:
1. Software engineers
2. Data scientists
3. Data analysts
4. Statisticians with an interest in deep learning
An introduction to AI (artificial intelligence) - Bellaj Badr
An introduction to AI (artificial intelligence)
The PPT link is available below: https://docs.google.com/presentation/d/1-oaO75DEdP259HNrrvh5fbZVOtaiiiffi3luyv0tShw/edit?usp=sharing
You can leave your comments on the Google Slides.
Deep learning (also known as deep structured learning or hierarchical learning) is the application of artificial neural networks (ANNs) with more than one hidden layer to learning tasks. Deep learning is part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms. Learning can be supervised, partially supervised, or unsupervised.
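As a minimal sketch of this definition – an ANN that is "deep" because it has more than one hidden layer – here is an illustrative forward pass in NumPy (the layer sizes, random weights, and ReLU activations are my own assumptions, not from any of the decks above):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0, x)

# A "deep" network by the definition above: two hidden layers.
# Shapes: 4 inputs -> 8 hidden -> 8 hidden -> 1 output.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 8)), np.zeros(8)
W3, b3 = rng.normal(size=(8, 1)), np.zeros(1)

def forward(x):
    h1 = relu(x @ W1 + b1)   # first hidden layer
    h2 = relu(h1 @ W2 + b2)  # second hidden layer
    return h2 @ W3 + b3      # linear output layer

x = rng.normal(size=(5, 4))  # batch of 5 samples
y = forward(x)
print(y.shape)  # (5, 1)
```

With only one hidden layer this would be a shallow network; the hierarchical representations deep learning is named for come from stacking such layers.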
3 Things to Learn About:
*The IoT ecosystem and data management considerations for IoT
*Top IoT use cases and data architecture strategies for managing the sheer volume and variety of IoT data
*Real-life case studies on how our customers are using Cloudera Enterprise to drive insights and analytics from all of their IoT data
The slides define IoT and show the difference between the M2M and IoT visions. They then describe the layers that make up the functional architecture of IoT; standards organizations, bodies, and other IoT technology alliances; low-power IoT protocols; IoT platform components; and finally give a short description of one of the IoT low-power application protocols (MQTT).
A comprehensive tutorial on Convolutional Neural Networks (CNNs) that covers the motivation behind CNNs and deep learning in general, followed by a description of the various components of a typical CNN layer. It explains the theory behind the different variants used in practice and gives a big picture of the whole network by putting everything together.
Next, there is a discussion of the various state-of-the-art frameworks used to implement CNNs for real-world classification and regression problems.
Finally, CNNs are demonstrated by implementing the paper 'Age and Gender Classification Using Convolutional Neural Networks' by Levi and Hassner (2015).
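To illustrate the core component such a CNN layer is built from – the convolution operation – here is a naive 2-D "valid" convolution (cross-correlation) in NumPy; the edge-detecting kernel and toy image are illustrative assumptions, not taken from the tutorial:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 2-D cross-correlation with 'valid' padding and
    stride 1, as used inside a CNN layer (no kernel flipping)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge detector applied to an image with a step edge.
image = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
])
kernel = np.array([
    [1, -1],
    [1, -1],
])
# Strong response (-2) only at the column where the edge sits.
print(conv2d_valid(image, kernel))
```

In a real CNN the kernels are not hand-crafted like this: they are learned by backpropagation, and frameworks implement the operation far more efficiently.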
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2023/06/accelerating-newer-ml-models-using-the-qualcomm-ai-stack-a-presentation-from-qualcomm/
Vinesh Sukumar, Senior Director and Head of AI/ML Product Management at Qualcomm Technologies, presents the “Accelerating Newer ML Models Using the Qualcomm AI Stack” tutorial at the May 2023 Embedded Vision Summit.
The Qualcomm AI Stack revolutionizes how Qualcomm thinks about AI software and provides the tools and user interface that enable ecosystem partners to create faster and smarter AI applications for all embedded form factors. Focusing on real user-experience challenges centered around model deployment, Sukumar explains how the Snapdragon developer community leverages data types, quantization, and neural architecture search, among other techniques, to optimize complex AI architectures for emerging use cases.
https://telecombcn-dl.github.io/2017-dlcv/
Deep learning technologies are at the core of the current revolution in artificial intelligence for multimedia data analysis. The convergence of large-scale annotated datasets and affordable GPU hardware has allowed the training of neural networks for data analysis tasks which were previously addressed with hand-crafted features. Architectures such as convolutional neural networks, recurrent neural networks and Q-nets for reinforcement learning have shaped a brand new scenario in signal processing. This course will cover the basic principles and applications of deep learning to computer vision problems, such as image classification, object detection or image captioning.
Congresso Sociedade Brasileira de Computação CSBC2016 Porto Alegre (Brazil)
Workshop on Cloud Networks & Cloudscape Brazil
Sergio Takeo Kofuji, Assistant Professor at the University of São Paulo, Coordinator to FI WARE LAB in University of São Paulo, Brazil
The European Commission, in a recent communication (April 19th), identified 5G and the Internet of Things (IoT) among the ICT standardisation priorities for the Digital Single Market (DSM). This session will discuss the emergence of the mobile edge computing paradigm, which reduces latency by processing large quantities of data near their source, and the need for the emerging 5G technology to satisfy the requirements of different verticals. Mobile Edge Clouds have the potential to provide an enormous amount of resources, but they raise several research challenges related to resilience, security, data portability and usage due to the presence of multiple trusted domains, as well as the energy consumption of battery-powered devices. Large, centralized clouds have been deployed and have shown how this paradigm can greatly improve performance and flexibility while reducing costs. However, many issues still require solutions that are user- and context-aware, dynamic, and capable of handling heterogeneous demands and systems. This challenge is triggered by the IoT scenario, which strongly requires cloud-based solutions that can be dynamically located and managed, on demand and with self-organization capabilities, to serve the purposes of different verticals.
Part 1 of the Deep Learning Fundamentals Series, this session discusses the use cases and scenarios surrounding deep learning and AI; reviews the fundamentals of artificial neural networks (ANNs) and perceptrons; discusses the basics of optimization, beginning with the cost function, gradient descent, and backpropagation; and covers activation functions (including Sigmoid, TanH, and ReLU). The demos included in these slides run on Keras with a TensorFlow backend on Databricks.
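The activation functions named above (Sigmoid, TanH, ReLU) and a single-variable gradient-descent loop can be sketched in a few lines of NumPy; the quadratic cost J(w) = (w - 3)^2, the learning rate, and the iteration count are purely illustrative assumptions, not the session's demos:

```python
import numpy as np

# The three activation functions mentioned above.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

# Gradient descent on a toy cost J(w) = (w - 3)^2,
# whose gradient is dJ/dw = 2 * (w - 3).
w, lr = 0.0, 0.1
for _ in range(100):
    w -= lr * 2.0 * (w - 3.0)  # step against the gradient
print(round(w, 4))  # converges toward the minimum at w = 3
```

Backpropagation is the same idea applied layer by layer: the chain rule supplies the gradient of the cost with respect to each weight, and gradient descent updates them all.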
A late upload. This slide deck was presented on Aug 31, 2019, when I delivered a talk at an AIoT seminar at the University of Lambung Mangkurat, Banjarbaru. It was part of the Republic of IoT 2019 event.
(See: http://youtu.be/9rP-5TSk_dA)
Electronic systems support every aspect of our lives today, both visibly and invisibly. Numbering in the tens of billions, they are the dominant form of computing we now experience. And while many dissipate just milliwatts, their sheer volume makes them a significant consumer of energy in their own right. Energy efficiency in computing has moved from the mainframe to become a consumer issue.
## By Ian Phillips http://ianp24.blogspot.co.uk/
## Opinions expressed are my own
## (Moved from SlideShare 10 Mar 14 with 1064 views)
The prevalence of computers in the form of so-called "smart" devices embedded in our everyday environment is inevitable. From a pentester's perspective, the adjective "smart" at first glance can hardly be used to describe their inventors and ambassadors.
Based on a few examples (i.a. BTLE beacons, smart meters, security cameras...), I will show how easily "smart" devices can be outsmarted. Sometimes you don't even need any 'hacking' skills, or the default configuration is wide open. But are we doomed? What are the conditions for a real threat? Can the vulnerabilities be exploited anonymously and as easily as in a web application? Where is the physical border the intruder would be likely to cross? The risks involved are usually different, but does that mean we don't have to worry? Are we sure how to use the emerging technology securely?
The (Io)Things you don't even need to hack. Should we worry? - SecuRing
TechInAsia PDC 2019 - Unlocking The Potential of IoT with AI - Andri Yadi
A late upload. I was honored to be involved and to speak for the 3rd time at the TechInAsia Product Development Conference (PDC). At PDC 2019, I shared how the potential of IoT can be unlocked with the help of AI.
Global Azure Bootcamp 2019 - AIoT powered by Azure - Andri Yadi
A very late share. This is the deck I presented when discussing AIoT powered by Microsoft Azure, during Global Azure Bootcamp 2019 in Bandung, Indonesia, in April 2019.
Opportunities & Challenges in IoT - Future of IoT industry in Indonesia 2019 ... - Andri Yadi
A late share. I was honored to represent the Indonesia IoT Association in discussing the future of the IoT industry in Indonesia – the opportunities and challenges for the years to come – during the FGD on Development of the National IoT Industry 2019-2024.
Microsoft Azure-powered IoT & AI Solution To Help FarmerAndri Yadi
This deck is presented during my speaking in Microsoft's //DevCon / Digital Economy Summit, Jakarta, Feb 27, 2020, which was one of a kind event since it was attended by Mr. Satya Nadella (CEO of Microsoft) and Mr. Joko Widodo (President of Indonesia). I shared about how Azure can power SMARTernak - a livestock-farming assistance platform - to help farmers.
Delivered a talk to discuss developer-perspective technical introduction, stories around LoRa/LoRaWAN, also the state in Indonesia.
Use this deck for a sharing session with Maker4Nation community, back then on Oct 3, 2018 in Jakarta.
I was invited by Indosat Ooredoo, one of 3 biggest telco operator in Indonesia, to share about IoT Development for its Ask The Expert program. This is the deck I use to discuss about tips and tricks, thoughts, and some real-world use cases for IoT development and implementation.
Global Azure Bootcamp 2018 - Azure IoT CentralAndri Yadi
The deck I presented when talking about Azure IoT Central during Global Azure Bootcamp 2018, in Bandung city, Indonesia.
I should have uploaded this on last March 2018. Usual, lot of works. So, some info in this deck may change and some code referred may be deprecated. But the concept still should be relevant.
Maker Movement toward IoT Ecosystem in IndonesiaAndri Yadi
I had an honour to share my thought on Indonesia Maker Movement to a group of Indonesia IoT stakeholders and community. It's during a forum group discussion organized by Indonesia Ministry of Communication and Informatics.
IoT for Agriculture in a Nutshell: Technical PerspectiveAndri Yadi
It's a late upload. I had a chance to share my thought on how IoT can help agriculture, esp precision agriculture. I used this slide for the talk in a Indonesia Ministry of Agriculture's event.
Road to Republic of IoT - IoT Technologies & Machine LearningAndri Yadi
Yep, should have uploaded this on July 2017. To promote Republic of IoT (RIoT) hackathon, we do roadshow to few cities in Indonesia and this time in Semarang city. Here, I talked about technologies will be used during hackathon, especially LoRa, ESP32, and machine learning.
IoT Connectivity: The Technical & PotentialAndri Yadi
I had a chance to deliver a talk in Huawei Tech Day 2017 at University of Indonesia. I used this slide to discuss the connectivity options in IoT, from the technical perspective, while also discussed a bit of the potential.
I used this slide to deliver a talk in "Face the Future through IoT" seminar, where I talked technicality behind IoT and delivered a comprehensive demo from the sensor, connectivity, and process with Machine Learning, all on top of Azure.
Global Azure Bootcamp 2017 - Azure IoT Hub with LoRa ConnectivityAndri Yadi
Should have posted 1 year ago. In this Global Azure Bootcamp 2017, I had a chance to share how to connect IoT devices to Azure IoT Hub by leveraging LoRa/LoRaWAN connectivity.
Road to Republic of IoT - ESP32 Programming and LoRaAndri Yadi
To promote Republic of IoT (RIoT) hackathon, we do roadshow to few cities in Indonesia and this time in Bogor. Here, I talked about technologies will be used during hackathon, especially LoRa and ESP32.
I use this keynote to share my view on entrepreneurship, what it takes to be an entrepreneur, which is problem solving mindset through "making" activity.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Search and Society: Reimagining Information Access for Radical FuturesBhaskar Mitra
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies needs to be explicitly articulated, and we need to develop theories of change in context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio, cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors, and newer malware including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
1. Intro to TinyML
Technical, implementation with Arduino
Arduino Day 2020 Online Event
Bandung, Indonesia
Mar 21, 2020
Andri Yadi
CEO, DycodeX
Microsoft MVP, Azure
2.
3. Andri Yadi
Co-founder & CEO of DycodeX
Vice Chairman, Indonesia IoT Association (ASIOTI)
Microsoft MVP, Azure
a (at) dycodex.com | andriyadi.com | github.com/andriyadi
Physicist, Developer, Maker, Community Guy, Entrepreneur
About Me
Microsoft Most Valuable Professional (MVP) for 12 years
Code for food & passion for 20 years
Break & make electronic stuff for 22 years
Trying to change the world through entrepreneurship, 15 years now
5. IoT High Level Architecture
Things (lots of them) → Communication Networks → Gateways / Base Station → Internet → Cloud (Ingestion, Infrastructure, Logics, API) → Internet → Apps / User
10. You often see… "10 years battery life" is a GIMMICK! (about LPWA connectivity)
Achievable by a combination of 2-3 of these:
Using a big-capacity battery (e.g. 19 Ah)
Transmitting data only a few times a day
Payload size of 10s-100s of bytes → short tx
Notes:
NB-IoT (and other LPWA connectivity) does consume much lower power than WiFi or 4G/LTE, due to its low data rate: only tens to hundreds of mA per transmission.
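The arithmetic behind that claim is easy to sketch. A rough back-of-envelope estimate, where every current and time figure below is an illustrative assumption (not from any datasheet):

```python
# Back-of-envelope battery-life estimate for an LPWA node.
# All current/time figures are illustrative assumptions.
BATTERY_MAH = 19000      # 19 Ah battery, as mentioned on the slide
SLEEP_MA = 0.010         # deep-sleep current (10 uA, assumed)
TX_MA = 100              # current draw while transmitting (assumed)
TX_SECONDS = 5           # airtime per transmission (assumed)
TX_PER_DAY = 3           # "transmitting data a few times a day"

# Average current: sleep baseline plus duty-cycled transmit bursts
avg_ma = SLEEP_MA + (TX_MA * TX_SECONDS * TX_PER_DAY) / 86400.0
life_years = BATTERY_MAH / avg_ma / (24 * 365)
```

With these optimistic numbers the battery alone lasts well past 10 years; in practice self-discharge, sensor duty cycles, and retransmissions eat most of that margin. That is exactly the slide's point: the headline figure only holds under these specific conditions.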
12. So, if you have… (SMARTernak electronics board)
Sensors:
GPS / GNSS
Accelerometer, Gyroscope
Body Temperature
Ambient Temperature & Humidity
Barometric Pressure
Ambient Light
MEMS Microphone
Device Removal Detector
You may only transmit:
Current lat, long, speed, direction
Current values or average motion
Current, peak, or lowest body temperature
Current, peak, or avg. temperature & humidity
Current, peak, or avg. barometric pressure
Current, peak, or lowest ambient light
Current, peak, or lowest sound level in dB
Detached or not
99% of sensor data has to be discarded due to the connectivity's bandwidth or power constraints.
17. On the other hand, consider a use case… Motor failure detection
Motor → Microcontroller → LPWA → Cloud → Dashboard
For that, we usually collect raw data from a vibration sensor (e.g. 1 KB of samples: 5D 1B 16 56 01 41 …) and transmit it to the cloud.
In the cloud, the raw data is processed to classify whether the motor is working normally or has a fault ("Fault detected" on the dashboard).
If we try to send 1 KB of raw data over LPWA (e.g. LoRa), it may take "forever" and consume more power…
…it will defeat the purpose of low-power connectivity.
19. Motor failure detection: so, what we may do instead
Motor → Microcontroller → LPWA → Cloud → Dashboard
Due to bandwidth and power constraints, we only send sampled/calculated data (e.g. the peak value, or FFT results) instead of the 1 KB of raw vibration data.
But we may miss a lot of interesting events, and a fault may go undetected!
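The "sampled/calculated" step can be sketched in a few lines. A minimal illustration using NumPy; the choice of peak amplitude plus dominant frequency as features is mine, not the deck's:

```python
import numpy as np

def summarize_vibration(samples, sample_rate_hz=1000):
    """Reduce a raw vibration window to a few bytes' worth of features.

    Instead of shipping ~1 KB of raw samples, transmit only the peak
    amplitude and the dominant frequency (illustrative feature choice).
    """
    samples = np.asarray(samples, dtype=float)
    peak = float(np.max(np.abs(samples)))
    spectrum = np.abs(np.fft.rfft(samples))
    spectrum[0] = 0.0                      # ignore the DC component
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    dominant_hz = float(freqs[np.argmax(spectrum)])
    return peak, dominant_hz

# Example: a 50 Hz sine wave sampled at 1 kHz for one second
t = np.arange(1000) / 1000.0
peak, dominant = summarize_vibration(np.sin(2 * np.pi * 50 * t))
```

Two floats (or a couple of bytes after quantization) now replace a kilobyte of raw samples, which is what makes LPWA uplinks feasible.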
23. Motor failure detection: back to the use case
Somehow, there's on-device "intelligence" to do the complete analysis, right on the device.
Motor → Microcontroller (on-device processing) → LPWA → Cloud → Dashboard
The 1 KB of raw vibration data stays on the device; we only transmit the conclusion, just a few bytes of processed data, to the cloud:
0A 0B 01 (→ Normal)
0A 0B 02 (→ Fault)
That may save power and bandwidth, and deliver a more complete analysis.
What is this? Machine Learning?
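The few-byte uplink can be sketched as a tiny encoder. This mirrors the 0A 0B 01 / 0A 0B 02 frames on the slide, treating 0A 0B as a hypothetical device/message header, since the deck doesn't spell out what those bytes mean:

```python
import struct

HEADER = 0x0A0B                       # hypothetical 2-byte header/device id
STATUS_NORMAL, STATUS_FAULT = 0x01, 0x02

def encode_uplink(fault_detected: bool) -> bytes:
    """Pack the on-device inference result into a 3-byte LPWA payload."""
    status = STATUS_FAULT if fault_detected else STATUS_NORMAL
    return struct.pack(">HB", HEADER, status)  # big-endian: 0A 0B xx
```

Three bytes per message comfortably fits even the smallest LoRaWAN payload budgets, versus the 1 KB of raw samples.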
25. Machine Learning: some ML problems
Classification: what's happening right now?
Anomaly Detection: is it "normal"?
Forecasting: what will happen in the future?
Credit: https://www.slideshare.net/janjongboom/adding-intelligence-to-lorawan-devices-the-things-conference-2020
26. Machine Learning
For those same problems, ML inference is possible on an ultra-low-power, low-cost, tiny microcontroller.
27. AI at the Edge: why the hassle?
High performance: less latency, as no raw data needs to go to the cloud/backend.
Still works offline: machine learning inference happens right on-device. No internet connectivity is needed to get the inference result from the cloud.
Efficient power and bandwidth consumption: only transmit the necessary inference results, which are small in data size. This makes it possible to use Low Power Wide Area (LPWA) connectivity, e.g. LoRa or NB-IoT. Using optimized Edge hardware, it is even possible to run on battery.
30. What do we need? Hardware & Software
ML-accelerated processor: a class of microprocessor designed as hardware acceleration for AI applications, e.g. for neural networks, machine vision, and machine learning.
Examples: GPU, FPGA, ASIC, DSP
Development Board / Accessories: a standalone computing system with an ML-accelerated processor (main or co-processor) as a main or additional development unit.
Examples: Coral Dev Board or USB Accelerator, Intel Neural Compute Stick, Sipeed Maix, Arduino Nano 33 BLE Sense, ESPectro32
Software Tools: ML compilers, ML model design tools, ML model converters, SDKs & libraries, OS.
Examples: TensorFlow & TensorFlow Lite (TFLite), TFLite for Microcontrollers, OpenVINO, Kendryte's nncase, Edge Impulse
Support: documentation, model zoo, examples, datasheets.
31. AI at the Edge: General Pipeline
Train ML model (on your awesome machine or cloud) → ML model file (.h5, .pb, .caffemodel, ONNX)
Optimize & convert (compress, remove, replace, convert) → Intermediate Representation (IR) file
Transfer the IR file to the Edge device
Inference (with an Edge-optimized inference library) → inference result
33. TinyML Pipeline
Train ML model (on your awesome machine or cloud) → .h5 / .pb
Optimize (TOCO) → TFLite FlatBuffer
Convert (xxd) → C byte array
Integrate into MCU firmware: the C-array ML model runs under TensorFlow Lite for Microcontrollers, alongside your custom logic.
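The "Convert (xxd) → C byte array" stage is easy to reproduce in Python when xxd isn't handy. A sketch; the sample bytes below stand in for a real TFLite FlatBuffer, which would come from tf.lite.TFLiteConverter:

```python
def to_c_array(data: bytes, name: str = "g_model") -> str:
    """Emit a TFLite FlatBuffer as a C byte array, roughly what
    `xxd -i model.tflite` produces, ready to compile into MCU firmware."""
    hex_bytes = ", ".join(f"0x{b:02x}" for b in data)
    return (f"const unsigned char {name}[] = {{{hex_bytes}}};\n"
            f"const unsigned int {name}_len = {len(data)};")

# Stand-in bytes; a real model would be the output of
# tf.lite.TFLiteConverter.from_keras_model(model).convert()
c_source = to_c_array(b"\x1c\x00\x00\x00TFL3")
```

The resulting C source is compiled straight into the firmware, so the model lives in flash and needs no filesystem on the microcontroller.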
34. TinyML Pipeline, e.g. leveraging Azure Machine Learning
Azure Machine Learning Pipeline → TensorFlow model file
Compress (toco) → TFLite FlatBuffer
Convert (xxd) → C byte array
Integrate into MCU firmware
35. Typical TinyML Model Training
Labelled raw data (Label1, Label2, Label3) → Signal processing (low/high-pass filter, FFT) → Features (after filter, frequency domain) → Neural network → Output
Processing blocks: to extract features.
Learning blocks: to classify new data. The NN size can be significantly small thanks to the features extracted by the previous blocks.
Credit: EdgeImpulse.com. Check it out!
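The processing-block idea can be sketched with NumPy: filter the raw window, then turn it into a small fixed-size feature vector for a tiny NN to consume. The moving-average filter and 16-bin spectral pooling here are illustrative choices of mine, not Edge Impulse's exact blocks:

```python
import numpy as np

def low_pass(samples, window=8):
    """Moving-average low-pass filter: a simple 'processing block'."""
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="valid")

def spectral_features(samples, n_bins=16):
    """Pool the FFT magnitude spectrum into a fixed number of bins, so
    the neural network input stays small regardless of window length."""
    spectrum = np.abs(np.fft.rfft(samples))
    bins = np.array_split(spectrum, n_bins)
    return np.array([b.mean() for b in bins])

# A noisy 10 Hz tone as stand-in sensor data (seeded for repeatability)
rng = np.random.default_rng(0)
t = np.arange(512) / 512.0
noisy = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(512)
features = spectral_features(low_pass(noisy))
```

A 512-sample window collapses to 16 features, which is why the downstream neural network can stay significantly small.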
40. Train Model
If necessary, then compress and convert the model… or use a pre-trained model instead.
Reference: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/micro/examples/magic_wand/train
41. Enjoy!
Open the Serial Monitor to see the inference result.
Recognised gestures are also displayed on the LED matrix*
* On the latest source code, the display is already rotated :)
42. Other Demo
Similar demo, but using Arduino Nano 33 BLE Sense board
https://www.youtube.com/watch?v=Lfv3WJnYhX0 | https://github.com/andriyadi/MagicWand-TFLite-Arduino
43. Call to Action
TinyML:
TensorFlow Lite for Microcontrollers: https://www.tensorflow.org/lite/microcontrollers
Great book: "TinyML" by Pete Warden & Daniel Situnayake
Azure:
Azure Machine Learning: https://azure.microsoft.com/en-in/services/machine-learning-service/
DycodeX:
ESPectro32 dev board: https://shop.makestro.com/product/espectro32-v2/
SMARTernak: https://smarternak.com
Other IoT + AI products & solutions: https://dycodex.com
Contact me:
andri (at) dycodex.com
https://github.com/andriyadi | https://slideshare.net/andri_yadi/
44. Want to put “AI” in “BrAIns”?
… a.k.a. adopting AI + IoT -> AIoT?
45. AI + IoT enabler
Keep in touch
hi (at) dycodex.com | https://dycodex.com
Bandung, Indonesia