Scaling Deep Learning Models for Large Spatial Time-Series Forecasting
1. Scaling Deep Learning Models for Large Spatial
Time-Series Forecasting
Zainab Abbas1, Jon Reginbald Ivarsson1, Ahmad Al-Shishtawy2 and Vladimir Vlassov1
1 KTH Royal Institute of Technology
2 RISE SICS
Stockholm, Sweden
IEEE BIGDATA 2019, LOS ANGELES, DEC 9-12, 2019
2. Challenge of Scale
● Deep neural networks are used for many machine learning tasks, including
spatial time-series forecasting.
● At scale, training deep NNs is computationally and memory intensive.
● Partitioning and distribution is a general approach to the challenge of scale in
NN-based modelling:
● dividing the problem into smaller tasks;
● these tasks comprise smaller models working on subsets of the data.
3. Traffic Data
● Sensor ID
● GPS coordinates
● Time
● Flow (no. of cars per minute)
● Average Speed (km per hour)
Density (cars per km) = Flow / Speed, with flow converted to cars per hour
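The density relation above can be sketched as follows; the function name and the sample values are ours, chosen only for illustration. Flow is recorded per minute in the data, so it must be converted to the same time unit as speed before dividing.

```python
def density(flow_per_min, speed_kmh):
    """Traffic density (cars per km) from flow and average speed."""
    flow_per_hour = flow_per_min * 60.0  # match the time unit of speed
    return flow_per_hour / speed_kmh

# 20 cars/min at 80 km/h -> 15.0 cars per km
print(density(20, 80))
```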
4. Large amount of sensor data
● Radar sensors deployed on
Stockholm highways
● More than 88 million data
points collected by 2058
sensors
● Number of sensors
increasing every year
● One sensor value per minute
5. Research Questions
● How to partition spatial time-series while preserving dependencies among
them?
● Which and how many spatial time-series do we take into account for a fast and
accurate forecast?
6. Graph Representation
● The traffic sensors are represented in the
form of a directed weighted graph
● Sensors at the same location but
on different lanes are merged into a single
vertex
● The paths between sensors are
represented as edges
● An edge is weighted with the travel time
between the corresponding sensors
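A minimal sketch of this representation, using a plain adjacency dictionary; the sensor IDs and travel times below are made up for illustration, not taken from the Stockholm data.

```python
# Vertex -> {successor: travel time in minutes}; edges are directed
# along the driving direction, weighted by travel time between sensors.
graph = {}

def add_edge(graph, src, dst, travel_time):
    """Add a directed edge src -> dst weighted by travel time."""
    graph.setdefault(src, {})[dst] = travel_time
    graph.setdefault(dst, {})  # ensure dst exists even without outgoing edges

add_edge(graph, "S1", "S2", 1.5)  # S1 -> S2 takes 1.5 min
add_edge(graph, "S2", "S3", 2.0)
add_edge(graph, "S4", "S2", 0.8)  # on-ramp sensor merging toward S2

print(graph["S1"])  # {'S2': 1.5}
```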
7. Graph Partitioning
1) Directed Weighted Graph
2) Creation of Base Partitions: do a backward traversal from the
starting vertex in the graph till the
threshold is met.
3) Creation of Base Partitions Graph
4) Addition of Partitions from Front and Behind
9. Stacked LSTMs
● Stacked LSTMs are built of multiple LSTM layers
placed on top of each other
● A deeper and more powerful neural network than
the conventional single-layer architecture
● Capable of learning non-linear dependencies in the data
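A minimal NumPy sketch of the stacking idea: each layer's hidden-state sequence becomes the input sequence of the layer above. The weight shapes and random values are illustrative only; this is not the paper's model.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; gates ordered [input, forget, cell, output]."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = 1 / (1 + np.exp(-z[:n]))        # input gate
    f = 1 / (1 + np.exp(-z[n:2*n]))     # forget gate
    g = np.tanh(z[2*n:3*n])             # candidate cell state
    o = 1 / (1 + np.exp(-z[3*n:]))      # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def stacked_lstm_forward(seq, layers):
    """Run `seq` through stacked LSTM layers; each layer's hidden
    states feed the next layer as its input sequence."""
    for (W, U, b, n) in layers:
        h, c = np.zeros(n), np.zeros(n)
        out = []
        for x in seq:
            h, c = lstm_step(x, h, c, W, U, b)
            out.append(h)
        seq = out
    return seq  # hidden states of the top layer

rng = np.random.default_rng(0)
def make_layer(d_in, n):
    return (rng.normal(0, 0.1, (4 * n, d_in)),
            rng.normal(0, 0.1, (4 * n, n)),
            np.zeros(4 * n), n)

layers = [make_layer(2, 8), make_layer(8, 4)]   # two stacked layers
seq = [rng.normal(size=2) for _ in range(5)]    # 5 time steps, 2 features
top = stacked_lstm_forward(seq, layers)
print(len(top), top[-1].shape)  # 5 (4,)
```

In the deck's own stack this would correspond to layering `LSTM` cells in TensorFlow, with each layer returning its full output sequence to the next.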
10. Technology
● Apache Spark 2.4.0
● Python 2.7.15
● TensorFlow 1.11.0
21. Thank you :)
Zainab Abbas
zainabab@kth.se
Vladimir Vlassov
vladv@kth.se
Ahmad Al-Shishtawy
ahmad.al-shishtawy@ri.se
Jon R. Ivarsson
mail@reginbald.com