Delivering Container-based Apps to IoT Edge devices
May 28, 2021
I presented this talk at DockerCon. It was all about AI + Docker + IoT: it showcased how a Docker app talks to sensors, GPUs and a camera module, and demoed how sensor data can be visualized on a Grafana dashboard, all running on an IoT Edge device.
About Me
Ajeet Singh Raina
- Docker Captain
- ARM Innovator
- Author @ collabnix.com
- Docker Community Leader
- DevRel at Redis Labs
- Previously at Dell, VMware & CGI
Agenda
- The Rise of Docker for AI
- Autonomous Robotic Platform
- Docker on IoT Edge
- IoT Edge Sensor Analytics
- Real-time video analytics
- Real-time Crowd Mask detection
Around 94% of AI adopters are using containers or plan to use them within one year.
Source: 451 Research
A Food Delivery Robot
- An autonomous robot system
- Camera
- Sensors
- GPS
- NVIDIA Jetson TX2
But how do I build apps for such a robotic platform at a faster pace?
Docker on NVIDIA Jetson Nano
Build on Open Source
● Install the latest version of Docker
$ curl https://get.docker.com | sh \
  && sudo systemctl --now enable docker
● Set up the NVIDIA Container Toolkit repository
$ distribution=$(. /etc/os-release; echo $ID$VERSION_ID) \
  && curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add - \
  && curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list
● Install the nvidia-docker2 package
$ sudo apt-get update
$ sudo apt-get install -y nvidia-docker2
$ sudo systemctl restart docker
● Run an Ubuntu ARM container
$ docker run -it arm64v8/ubuntu /bin/bash
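Not on the original slide, but a quick sanity check (assuming a standard nvidia-docker2 install) that the NVIDIA runtime is registered with Docker after the restart:
$ sudo docker info | grep -i runtimes    # "nvidia" should appear in the list of runtimes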
Docker access to NVIDIA GPU
Build on Open Source
● Pre-requisite
$ sudo apt-get install -y nvidia-container-runtime
● Expose GPU for use
$ docker run -it --rm --gpus all ubuntu nvidia-smi
● Specify the GPUs
$ docker run -it --rm --gpus device=GPU-3a23c669-1f69-c64e-cf85-44e9b07e7a2a ubuntu nvidia-smi
● Set NVIDIA capabilities
$ docker run --gpus 'all,capabilities=utility' --rm ubuntu nvidia-smi
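Not shown on the slide, but GPUs can also be selected by index rather than UUID; a small sketch:
$ docker run -it --rm --gpus device=0 ubuntu nvidia-smi    # select the first GPU by index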
Enabling GPU access with Compose
Build on Open Source
● Compose services can define GPU device reservations
services:
  test:
    image: nvidia/cuda:10.2-base
    command: nvidia-smi
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu, utility]
● Bring up the service
$ docker-compose up
Creating network "gpu_default" with the default driver
Creating gpu_test_1 ... done
Attaching to gpu_test_1
test_1 | NVIDIA-SMI 450.80.02   Driver Version: 450.80.02   CUDA Version: 11.1
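Worth noting (not on the slide): the deploy.resources.reservations.devices syntax needs a recent Compose release, so it is worth checking the version first:
$ docker-compose version    # GPU device reservations require docker-compose 1.28.0 or newer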
Do I need ARM-based devices to
build ARM-based images?
Not necessarily.
Buildx - A CLI Plugin to build Multi-arch Docker Images
BUILD SHIP RUN
$ docker buildx build --platform linux/arm/v7,linux/arm64 -t …
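A complete end-to-end sketch of what that looks like in practice when run from an x86 workstation; the image name <dockerhub-user>/demo is a placeholder, not from the slide:
$ docker run --privileged --rm tonistiigi/binfmt --install all    # register QEMU emulators for cross-building
$ docker buildx create --name mybuilder --use                     # create and select a buildx builder instance
$ docker buildx build --platform linux/amd64,linux/arm64,linux/arm/v7 \
    -t <dockerhub-user>/demo:latest --push .                      # build for three platforms and push the manifest list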
Real-time Implementations
IoT Sensor Analytics
BME680 Sensors + Docker + RedisTimeSeries + Grafana on Jetson Nano
Real-Time Video Analytics
Docker + Computer Vision + RedisEdge
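A minimal sketch of the ingestion side of such a pipeline, assuming the redislabs/redisedge image (Redis bundled with RedisAI, RedisGears and RedisTimeSeries) and an illustrative stream name camera:0; neither is spelled out on the slide:
$ docker run -d --name redisedge -p 6379:6379 redislabs/redisedge:latest   # assumed image bundling Redis with its AI/edge modules
$ redis-cli XADD camera:0 '*' frame "<jpeg-bytes>"                         # push one captured frame (placeholder payload) onto a Redis stream
$ redis-cli XLEN camera:0                                                  # confirm frames are arriving for the inference consumer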
Smart Camera system for Real-Time Crowd Face Mask detection
Docker + Edge AI + Camera
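Giving the detection container access to the camera module is mostly a matter of passing the device through; a sketch, with the image name as a placeholder:
$ docker run -it --rm \
    --device /dev/video0 \                      # pass the CSI/USB camera through to the container
    <dockerhub-user>/mask-detector:latest       # placeholder image name for the mask detection app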
Building IoT Sensor Analytics
[Architecture diagram: Sensors/IoT, Ingest App, RedisTimeSeries, Redis Data Source for Grafana, UI]
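A minimal sketch of the Redis side of this pipeline, assuming arm64 builds of the redislabs/redistimeseries and grafana/grafana images, the Grafana plugin id redis-datasource, and an illustrative key ts:temperature; the reading shown is made up:
$ docker run -d --name redistimeseries -p 6379:6379 redislabs/redistimeseries:latest   # Redis with the RedisTimeSeries module
$ docker run -d --name grafana -p 3000:3000 \
    -e GF_INSTALL_PLUGINS=redis-datasource grafana/grafana:latest                      # Grafana with the Redis Data Source plugin (plugin id assumed)
$ redis-cli TS.ADD ts:temperature '*' 28.4                                             # append one BME680 temperature reading with an auto timestamp
$ redis-cli TS.RANGE ts:temperature - +                                                # read the series back, as the Grafana panel would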