1. Learning IOT in Edge: Deep
Learning for the Internet of
Things with Edge Computing
He Li, Kaoru Ota, and Mianxiong Dong
[11] H. Li, K. Ota, and M. Dong, “Learning IoT in edge: Deep learning for the Internet of
Things with edge computing,” IEEE Netw., vol. 32, no. 1, pp. 96–101, Jan./Feb. 2018.
2. Introduction
• Deep learning plays an important role in IoT services
• Edge computing is an important technology for IoT services
• The deep learning model is well suited to edge computing
• Privacy is preserved when transferring intermediate data
• Improves learning performance and reduces network traffic
3. Motivation
• Different deep learning models have different intermediate data sizes and preprocessing overheads
• The scheduling tries to guarantee the QoS of each deep learning service for IoT
4. Contributions
• Introduced deep learning for IoT into the edge computing environment
• Formulated an elastic model for varying deep learning models for IoT in edge computing
• Designed efficient algorithms to optimize the service capacity of the edge computing model
• Tested the deep learning model for IoT with extensive experiments in a given edge computing environment
5. Related Work-Deep learning for IoT
• Important benefits of deep learning over machine learning:
1. Better performance at large data scale
2. Extracts new features automatically for different problems
3. Takes less time to infer information
6. Related Work-Deep learning for IoT
• Executing deep learning tasks in IoT devices
• Lane et al. [10]: DeepEar and DeepX
[10] N. D. Lane, P. Georgiev, and L. Qendro, “DeepEar: Robust Smartphone Audio
Sensing in Unconstrained Acoustic Environments Using Deep Learning,” Proc.
2015 ACM Int’l. Joint Conf. Pervasive and Ubiquitous Computing, 2015, pp.
283–94.
7. Related Work-Deep learning for IoT
• Introducing deep learning into more IoT applications [11]
• Bhattacharya et al. [12]: a new deep learning model for wearable IoT devices
[11] L. Li et al., “Eyes in the Dark: Distributed Scene Understanding for Disaster
Management,” IEEE Trans. Parallel Distrib. Systems, 2017. DOI:
10.1109/TPDS.2017.2740294.
[12] S. Bhattacharya and N. D. Lane, “Sparsification and Separation of Deep
Learning Layers for Constrained Resource Inference on Wearables,” Proc. 14th
ACM Conf. Embedded Network Sensor Systems (SenSys ’16), 2016,
pp. 176–89.
8. Related Work-Deep learning for IoT
• Cloud-assisted deep learning applications
• Alsheikh et al. [13]: a framework combining deep learning algorithms and Apache Spark for IoT data analytics
• This two-layer design is very similar to edge computing
[13] M. A. Alsheikh et al., “Mobile Big Data Analytics Using Deep Learning and
Apache Spark,” IEEE Network, vol. 30, no. 3, May/June 2016, pp. 22–29.
9. Related Work-Deep learning and edge
computing
• Edge computing brings two major improvements to cloud computing:
1. Edge nodes preprocess large amounts of data before transferring them to the central servers
2. Cloud resources are optimized by enabling edge nodes with computing ability
• Liu et al.: a food recognition application
10. Deep learning for IoT in edge computing-A
video sensing scenario
• The cameras monitor the environment and collect the video data
• The cameras then transfer the collected data to the IoT gateway
• The IoT gateway forwards all collected data to the cloud server after processing the raw data
11. Deep learning for IoT in edge computing-A
video sensing scenario
• The input data are processed in the deep learning layers
• Each layer processes the intermediate features from the previous layer and generates new features
• The features generated by the last layer are processed by a classifier and recognized as the output
12. Deep learning for IoT in edge computing-
Problem
• Deep learning improves the efficiency of multimedia processing for IoT services
• Communication performance becomes the bottleneck
13. Deep learning for IoT in edge computing-
Solution: Edge computing
• Edge layer: IoT devices, IoT gateway, network access points in the LAN
• Cloud layer: Internet connections, cloud servers
• Processing is performed in the edge layer
• Only intermediate data or results need to be transferred, so the pressure on the network is relieved
14. Deep learning for IoT in edge computing-An edge
computing structure for IoT deep learning tasks
15. Deep learning for IoT in edge computing-How
to divide each deep learning network
• Deploying more layers in edge servers reduces more network traffic
• The service capacity of edge servers is limited compared to cloud servers
• Every layer brings additional computational overhead to the server
• Different deep learning networks and tasks have different intermediate data sizes and computational overheads
• Efficient scheduling is needed
17. Scheduling problem and solution-Offline
algorithm
1. Find out:
• k_m^j: the value of k that maximizes r_k^j × l_k^j
• i_m^j: the edge server with the largest input data of task t_j
2. Sort all tasks in ascending order of the largest input data size
3. Deploy the task t_j with the minimum input data size to edge servers first
18. Scheduling problem and solution-Offline
algorithm
• Check whether all edge servers have enough service capacity and network bandwidth to deploy t_j
• Enough: deploy t_j into all edge servers
• An edge server not enough: change k and find an appropriate k
• Not enough even after varying k: do not deploy t_j
19. Scheduling problem and solution-Online
algorithm
• The task scheduler has little information about future tasks, so the deployment decision is based on historical tasks
• Find out k_m^j and i_m^j
• Define a threshold value f(c_{i_m^j}) in terms of B_max and B_min
• If the threshold inequality is satisfied, and the other edge servers have enough bandwidth and service capacity, deploy t_j into the edge servers
20. Performance evaluation-Collecting data from
deep learning tasks
• Use Caffe as the CNN framework
• Define 10 different CNN networks
• Execute 10 CNN tasks with different CNN networks
• Record the number of operations and intermediate data
generated in each layer
21. Performance evaluation-Collecting data from
deep learning tasks
• The input data can be reduced by the deep learning networks
• More intermediate data are reduced by the lower layers
• Computational overhead increases quickly with more layers
Blue plots: reduced data size ratio
Red plots: computational overhead
23. Performance evaluation-Simulation: FIFO vs
Online algorithm and LBF vs Online algorithm
• When the number of input tasks is near 600, the online algorithm deploys more tasks than FIFO
• When the number of input tasks is near 800, the online algorithm deploys more tasks than LBF
• Their online algorithm outperforms FIFO and LBF over a long time period
24. Conclusion
• The deep learning model is appropriate for the edge computing environment
• The performance evaluation shows that their solutions can increase the number of tasks deployed in edge servers with guaranteed QoS requirements
25. Discussion
• Why does the algorithm have to deploy a task in all edge servers?
1. For better performance (lower latency, higher reliability)
2. A threshold could be set instead, to deploy more tasks
An open problem in IoT is how to mine real-world data from a complex and noisy environment
Since deep learning is a strong analytic tool for huge volumes of data
it is considered the most promising approach to solve this problem
Deep learning plays an important role in IoT services due to its high efficiency in studying complex data
Edge computing is another important technology for IoT services
The centralized cloud computing structure is becoming inefficient when processing and analyzing large amounts of data, because the data must be transferred over a network with limited performance
We can offload some computing tasks from the centralized cloud to the edge, near the IoT devices
Edge computing can perform well when the intermediate data is smaller than the input data
And a typical deep learning model usually has many layers in its learning network
So the intermediate data can be quickly scaled down by each layer
Therefore, the deep learning model is very appropriate for the edge computing environment
We can offload some of the layers to the edge and then transfer the reduced intermediate data to the centralized cloud server
Another advantage of deep learning with edge computing is privacy preservation when transferring intermediate data
Because the intermediate data in deep learning usually have different semantics from the source data
Thus, in this article, they introduce deep learning for IoT into edge computing to improve learning performance and reduce network traffic
Because different deep learning models have different intermediate data sizes and preprocessing overheads
they state a scheduling problem that maximizes the number of deep learning tasks under the limited network bandwidth and service capability of edge nodes
They also try to guarantee the quality of service of each deep learning service for IoT in the scheduling
The main contributions of this article are as follows
First, they introduce deep learning for IoT into the edge computing environment
Second, they formulate an elastic model for varying deep learning models for IoT in edge computing
Third, they design efficient algorithms to optimize the service capacity of the edge computing model
Finally, they test the deep learning model for IoT with extensive experiments in a given edge computing environment
They also introduce some related technologies on deep learning for IoT
Deep learning is an emerging technology for IoT applications and systems
The important benefits of deep learning over traditional machine learning are:
Better performance at large data scale, since many IoT applications generate a large amount of data for processing
It can extract new features automatically for different problems
And it takes much less time to infer information than traditional machine learning
Because of limited energy and computing capability, executing deep learning tasks in IoT devices is a big issue. General commercial hardware and software may fall short for it.
Lane et al. proposed new acceleration engines, such as DeepEar and DeepX, to support different deep learning applications on the latest mobile systems-on-chip. The experimental results show that high-spec mobile IoT devices can support part of the learning process.
Introducing deep learning into more IoT applications is another important issue
Bhattacharya et al. proposed a new deep learning model for wearable IoT devices that improves the accuracy of audio recognition tasks.
Most existing deep learning applications (e.g., speech recognition) still need to be cloud-assisted. Alsheikh et al. [13] proposed a framework combining deep learning algorithms and Apache Spark for IoT data analytics. The inference phase is executed on mobile devices, while Apache Spark is deployed in cloud servers to support data training.
This two-layer design is very similar to edge computing, which shows that it is possible to offload processing tasks from the cloud.
Edge computing brings two major improvements to existing cloud computing.
The first is that edge nodes can preprocess large amounts of data before transferring them to the central servers in the cloud
The other is that cloud resources are optimized by enabling edge nodes with computing ability
Due to these improvements, the aforementioned problems of the cloud infrastructure can be well addressed.
Liu et al. proposed a deep-learning-based food recognition application by employing edge-computing-based service infrastructure.
Their work shows that edge computing can improve the performance of deep learning applications by reducing response time and lowering energy consumption
They use a video recognition IoT application as the example
Several wireless video cameras monitor the environment and collect the video data
Then the cameras transfer the collected data to the IoT gateway through general WiFi connections
The IoT gateway forwards all collected data to the cloud server through Internet communication after processing the raw data
The input data will be processed in the deep learning layers
Each layer processes the intermediate features generated by the previous layer and then generates new features
Finally, the features generated by the last layer will be processed by a classifier and recognized as the output
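This layer-by-layer flow, where each layer consumes the previous layer's features and a classifier reads the last layer's output, can be sketched in plain Python. Everything here (the layer sizes, the toy weights, the argmax "classifier") is a hypothetical stand-in, not the paper's actual model:

```python
# Minimal sketch of the layer-by-layer flow described above.
# Layer sizes and weights are hypothetical; each "layer" maps an
# input feature vector to a (typically smaller) intermediate vector.

def make_layer(in_dim, out_dim):
    """Return a toy layer: a fixed linear map followed by ReLU."""
    weights = [[((i + j) % 3 - 1) * 0.1 for j in range(in_dim)]
               for i in range(out_dim)]
    def layer(x):
        return [max(0.0, sum(w * v for w, v in zip(row, x)))
                for row in weights]
    return layer

def forward(layers, x):
    """Run input through all layers, then a toy argmax 'classifier'."""
    for layer in layers:
        x = layer(x)  # intermediate features from this layer
    return max(range(len(x)), key=lambda i: x[i])  # classifier output

layers = [make_layer(8, 4), make_layer(4, 2)]  # features shrink: 8 -> 4 -> 2
label = forward(layers, [1.0] * 8)
print(label)
```

Note how the feature vector shrinks at each layer; that shrinkage is exactly what makes transferring intermediate data cheaper than transferring the raw input.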
Though deep learning improves the efficiency of multimedia processing for IoT services, since features are extracted by multiple layers instead of by traditional complex preprocessing,
the communication performance will be the bottleneck
Because the collected multimedia data are much larger than traditional structured data, and it is hard to improve the network performance for transferring the collected data from IoT devices to the cloud service
Edge computing is a possible solution to the problem
In the IoT network, there are two layers, the edge layer and the cloud layer. The edge layer consists of IoT devices, an IoT gateway, and network access points in local area networks, and the cloud layer consists of the Internet connection and cloud servers.
The processing can be performed in the edge layer, instead of the cloud layer.
Since only intermediate data or results need to be transferred from the devices to the cloud server, the pressure on the network is relieved by transferring less data
They present an edge computing structure for IoT deep learning tasks. The structure also consists of an edge layer and a cloud layer. In the edge layer, the edge servers are deployed in the IoT gateway for processing data. They first train the deep learning networks in the cloud server. After the training phase, they divide the learning networks into two parts: one part includes the lower layers while the other includes the higher layers. They deploy the part with the lower layers into edge servers, and the part with the higher layers into the cloud server for offloading processing. Thus, the collected data are input into the first layer in the edge server. The edge servers load the intermediate data from the lower layers and then transfer the data to the cloud server as input for the higher layers.
A problem is how to divide each deep learning network. Deploying more layers into edge servers can reduce more network traffic, since the size of intermediate data generated by the higher layers is usually smaller than that generated by the lower layers. However, the service capacity of edge servers is limited compared to cloud servers; it is impossible to process unlimited tasks in edge servers. Also, every layer in a deep learning network brings additional computational overhead to the server, so we can only deploy part of the deep learning network into edge servers. Meanwhile, as different deep learning networks and tasks have different intermediate data sizes and computational overheads, efficient scheduling is needed to optimize deep learning for IoT in the edge computing structure.
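The trade-off just described, where deeper splits shrink the transferred intermediate data but consume more of the edge server's limited capacity, can be sketched as a simple split-point search. The profile numbers and the function itself are illustrative assumptions, not the paper's scheduling algorithm:

```python
# Hedged sketch of choosing where to split a deep learning network
# between edge and cloud. Per-layer numbers are hypothetical.

def choose_split(input_size, inter_sizes, layer_costs, edge_capacity):
    """Return (layers kept on edge, data transferred to cloud).

    inter_sizes[k-1]: size of the intermediate data output by layer k
    layer_costs[k-1]: compute cost of running layer k on the edge
    Picks the split minimizing transferred data while the cumulative
    edge-side compute stays within edge_capacity.
    """
    best_k, best_traffic = 0, input_size  # split=0: send raw input to cloud
    cost = 0.0
    for k, (size, c) in enumerate(zip(inter_sizes, layer_costs), start=1):
        cost += c
        if cost > edge_capacity:
            break  # edge server cannot afford deeper layers
        if size < best_traffic:
            best_k, best_traffic = k, size
    return best_k, best_traffic

# Hypothetical profile: intermediate data shrinks, cost accumulates.
split, traffic = choose_split(
    input_size=100.0,
    inter_sizes=[60.0, 30.0, 12.0, 10.0],
    layer_costs=[5.0, 5.0, 10.0, 20.0],
    edge_capacity=22.0,
)
print(split, traffic)
```

With these toy numbers the fourth layer would shrink traffic slightly more, but its compute cost exceeds the remaining edge capacity, so the split stops at three layers.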
The scheduling problem attempts to assign the maximum number of tasks in the edge computing structure by deploying deep learning layers in IoT edge servers so that the required transfer latency of each task can be guaranteed. It is subject to three kinds of inequalities: the first is for bandwidth, the second for latency, and the third for computational overhead.
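One plausible way to write down this objective and its three constraint families, using illustrative symbols rather than the paper's exact notation:

```latex
% x_j \in \{0,1\}: whether task t_j is deployed in the edge;
% k_j: number of layers kept on the edge for t_j;
% b_j, \ell_j, c_j: bandwidth, latency, and compute of t_j at split k_j;
% B_i, C_i: bandwidth and service capacity of edge server i; L_j: latency bound.
\max_{x,\,k} \; \sum_j x_j
\quad \text{s.t.} \quad
\sum_j x_j\, b_j(k_j) \le B_i, \qquad
\ell_j(k_j) \le L_j, \qquad
\sum_j x_j\, c_j(k_j) \le C_i \qquad \forall i, j.
```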
They propose an offline algorithm and an online algorithm to solve the problem. First, the offline algorithm. It finds k_m^j, which maximizes the value of r_k^j multiplied by l_k^j, and the edge server i_m^j, which has the largest input data of task t_j. Then the algorithm sorts all tasks in ascending order of the largest input data size. The scheduling first deploys the task t_j with the minimum input data size to edge servers.
It traverses all edge servers to check whether each edge server has enough service capability and network bandwidth to deploy task t_j. If all edge servers have enough bandwidth and service capacity, it deploys task t_j into all servers. If an edge server does not have enough bandwidth and service capacity, it changes the value of k and finds an appropriate k for deploying task t_j in all edge servers. If the edge server still cannot make it even after varying k, it will not deploy task t_j in edge servers.
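The offline procedure in these notes — sort tasks by input data size, try to deploy each into all edge servers, and fall back to a different k when a server lacks bandwidth or capacity — can be sketched as follows. The task/server dictionaries and the shape of the per-k "options" are assumptions for illustration, not the paper's data structures:

```python
# Hedged sketch of the offline scheduling described above.

def offline_schedule(tasks, servers):
    """Deploy tasks in ascending order of largest input data size.

    tasks: list of dicts with 'input' (largest input data size) and
           'options': per-k (bandwidth, compute) demands, deepest k first.
    servers: list of dicts with remaining 'bandwidth' and 'capacity'.
    Returns the indices of tasks deployed in edge servers.
    """
    order = sorted(range(len(tasks)), key=lambda j: tasks[j]["input"])
    deployed = []
    for j in order:
        for bw, comp in tasks[j]["options"]:  # vary k until one choice fits
            if all(s["bandwidth"] >= bw and s["capacity"] >= comp
                   for s in servers):
                for s in servers:             # deploy into ALL edge servers
                    s["bandwidth"] -= bw
                    s["capacity"] -= comp
                deployed.append(j)
                break
        # if no k fits on every server, the task stays out of the edge
    return deployed

servers = [{"bandwidth": 10.0, "capacity": 10.0} for _ in range(2)]
tasks = [
    {"input": 5.0, "options": [(4.0, 6.0), (6.0, 3.0)]},
    {"input": 3.0, "options": [(3.0, 5.0)]},
    {"input": 9.0, "options": [(8.0, 9.0)]},
]
result = offline_schedule(tasks, servers)
print(result)
```

In this toy run, the second task deploys first (smallest input), the first task only fits after shrinking its edge-side share (its second k option), and the third task cannot be deployed at all.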
They also propose an online algorithm. When a task t_j arrives, as the task scheduler has little information about future tasks, the deployment decision is based on the historical tasks. First, they calculate k_m^j and i_m^j. Then they define a threshold value f(c_{i_m^j}) in terms of B_max and B_min, the maximum and minimum required bandwidths of a task. If the inequality on f(c_{i_m^j}) is satisfied and the other edge servers have enough bandwidth and service capacity, the scheduling algorithm deploys task t_j into the edge servers.
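A hedged sketch of an online admission test in this spirit. These notes do not give the exact definition of f(c), so an exponential interpolation between B_min and B_max (a common shape for such online admission thresholds) is used here purely as an illustrative stand-in:

```python
# Illustrative stand-in for the online admission test described above;
# the threshold function below is an assumption, not the paper's f(c).

def admit(task_bw, remaining_cap, total_cap, b_min, b_max):
    """Admit a task only if its bandwidth demand beats a threshold that
    rises from b_min toward b_max as the server's capacity fills up."""
    used_frac = 1.0 - remaining_cap / total_cap
    threshold = b_min * (b_max / b_min) ** used_frac  # stand-in for f(c)
    return task_bw >= threshold

# With ample spare capacity the threshold stays near b_min, so a modest
# task is admitted; once capacity fills, only high-value tasks pass.
print(admit(task_bw=2.0, remaining_cap=9.0, total_cap=10.0,
            b_min=1.0, b_max=8.0))
print(admit(task_bw=2.0, remaining_cap=2.0, total_cap=10.0,
            b_min=1.0, b_max=8.0))
```

The design intuition matches the notes: early on the scheduler accepts almost anything, but as capacity becomes scarce it reserves the remainder for tasks whose deployment saves the most bandwidth.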
In the experiments, they have two environments: one for collecting data from deep learning tasks and another for simulation.
For collecting data from deep learning tests, they use Caffe as the CNN framework, define 10 CNN networks, execute 10 CNN tasks with the different CNN networks, and record the number of operations and the intermediate data generated in each layer
As shown in the figure, they choose two deep learning networks as examples. The blue plots show the reduced data size ratio, and the red plots show the computational overhead.
From the plot, we can see that the input data can be reduced by the deep learning networks, and more intermediate data are reduced by the lower layers
Also, the computational overhead increases quickly with more layers
And then they develop a simulation.
They first test the offline algorithm.
They compared the performance with a fixed mode that has a fixed number of deep learning layers in the edge servers
As shown in the figure, the scheduling algorithm outperforms the fixed mode
And then they test the online algorithm
They compared the algorithm to two popular online scheduling algorithms, FIFO and lowest bandwidth first (LBF).
As shown in the figure, when the number of input tasks is near 600, the online algorithm deploys more tasks than FIFO; when the number of input tasks is near 800, the online algorithm deploys more tasks than LBF
And they state that their online algorithm outperforms FIFO and LBF over a long time period
In this article, they introduce deep learning for IoT into the edge computing environment to optimize network performance and protect user privacy in uploading data. They state that the deep learning model is appropriate for the edge computing environment
They also consider the limited service capability of edge nodes and propose algorithms to maximize the number of tasks in the edge computing environment.
The performance evaluation shows that their solutions can increase the number of tasks deployed in edge servers with guaranteed QoS requirements
Just for better performance, like reducing latency and improving reliability
But I think maybe they can set a threshold so that they can deploy more tasks