Introduction to Multiple Object Tracking
YANG FAN
D3 Candidate
Nara Institute of Science and Technology
Multiple Object Tracking (MOT)
1. Goal: Detect and link target objects in the spatio-temporal domain.
Examples: using masks (person only); using boxes (person only); using boxes (person, car, bike, etc.).
MOT Approach
2. Approach:
MOT is a complicated system.
Basically, we aim to link similar detections across frames.
MOT Approach: Simplified Version
2. Approach:
[Figure: detections a and b in frame k are matched to detections c and d in frame k+1. An appearance re-identification module embeds each detection, and the pairwise appearance distances (a-c: 0.2, a-d: 0.8, b-c: 0.7, b-d: 0.3) form the cost matrix for matching.]
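The matching step above can be sketched in a few lines. The function name and the brute-force search are illustrative choices of mine, not the slide's method: for a tiny cost matrix, trying all permutations finds the assignment with the smallest total appearance distance; real trackers solve the same objective with the Hungarian algorithm (e.g., scipy.optimize.linear_sum_assignment).

```python
from itertools import permutations

def match(dist):
    """Return the column assigned to each row, minimizing total distance."""
    n = len(dist)
    best = min(permutations(range(n)),
               key=lambda cols: sum(dist[r][c] for r, c in enumerate(cols)))
    return list(best)

# Distance matrix from the slide (rows: a, b in frame k; cols: c, d in frame k+1).
dist = [[0.2, 0.8],
        [0.7, 0.3]]
print(match(dist))  # [0, 1] -> a links to c, b links to d (total cost 0.5)
```

The alternative assignment (a-d, b-c) would cost 0.8 + 0.7 = 1.5, so the minimum-cost matching correctly preserves identities across the two frames.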
MOT Category
- Online MOT vs. offline MOT: does the tracker use frames from the future? (No: online; Yes: offline)
- Detection & data association: one-stage MOT (e.g., CenterTrack, FairMOT, JDE) merges the detection model and the ReID model into a single model (real time); two-stage MOT (e.g., DeepSORT) keeps them separate.
- Supervised MOT vs. unsupervised MOT: are tracklet labels used for training? (Yes: supervised; No: unsupervised)
- Data association methods: end-to-end MOT vs. ? MOT (e.g., the Hungarian Algorithm for the latter)
MOT Category
MOT (using boxes) vs. MOTS (using masks)
MOT Category
Single-class MOT (person) vs. multi-class MOT (car, truck, bike, etc.)
MOT Category
2D single-view MOT vs. 3D 360-degree-view MOT
MOT Category
Single-camera MOT vs. multi-camera MOT
Old Evaluation Metrics for MOT/MOTS
There are numerous metrics, but MOTA/MOTSA are generally the dominant ones.
https://motchallenge.net/
Old Evaluation Metrics for MOT/MOTS
Defects of MOTA/MOTSA: object detection performance can overwhelm data association performance.
The improvement on most MOT datasets may come mainly from improvements in object detection.
[Figure: tracking results with ground-truth detection vs. with estimated detection.]
Old Evaluation Metrics for MOT/MOTS
[Figure: an object detection stage feeding into the MOT stage.]
Some works compare their results, obtained with high-quality detections, against methods that use low-quality detections.
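The standard MOTA formula, MOTA = 1 - (FN + FP + IDSW) / GT, makes the defect easy to see: detection errors (false negatives and false positives) typically outnumber identity switches by an order of magnitude, so a stronger detector lifts the score even when association is unchanged. The counts below are hypothetical numbers chosen only to illustrate this.

```python
def mota(fn, fp, idsw, num_gt):
    """MOTA = 1 - (FN + FP + IDSW) / GT (CLEAR MOT definition)."""
    return 1.0 - (fn + fp + idsw) / num_gt

# Same association quality (100 ID switches), different detectors.
weak_detector   = mota(fn=3000, fp=1500, idsw=100, num_gt=10000)
strong_detector = mota(fn=1000, fp=500,  idsw=100, num_gt=10000)
print(f"{weak_detector:.2f} vs {strong_detector:.2f}")  # 0.54 vs 0.84
```

The 30-point MOTA gap here comes entirely from detection; the data association performance is identical in both runs.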
New Evaluation Metrics for MOT/MOTS
[Figure: two tracked sequences. Sequence 1: TrackAP50 = 1, TrackAP75 = 1, TrackAP95 = 1. Sequence 2: TrackAP50 = 1, TrackAP75 = 0, TrackAP95 = 0.]
Credit to Dave, A., Khurana, T., Tokmakov, P., Schmid, C. & Ramanan, D. TAO: A Large-Scale Benchmark for Tracking Any Object. In European Conference
on Computer Vision, 2020
However, these metrics have not been widely accepted yet.
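The track-level IoU behind TrackAP can be sketched as follows (the helper names are mine): a predicted track counts as correct at threshold t only if its spatio-temporal IoU with a ground-truth track, i.e., summed per-frame intersections over summed per-frame unions, exceeds t. A track with perfect identity but loose boxes can therefore pass TrackAP50 while failing TrackAP75 and TrackAP95, as in the example above.

```python
def box_iou_area(a, b):
    """Intersection and union areas of two (x1, y1, x2, y2) boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter, area_a + area_b - inter

def track_iou(pred, gt):
    """Spatio-temporal IoU: summed intersections over summed unions across frames."""
    inter = union = 0.0
    for p, g in zip(pred, gt):
        i, u = box_iou_area(p, g)
        inter += i
        union += u
    return inter / union

gt   = [(0, 0, 10, 10)] * 3   # ground-truth tube over 3 frames
pred = [(0, 0, 10, 6)] * 3    # identity kept, but boxes too short
print(track_iou(pred, gt))    # 0.6 -> passes TrackAP50, fails TrackAP75/95
```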
STATE-OF-THE-ART MOT
3D MOT 2015 Results
2D MOT 2015-2017 Results
STATE-OF-THE-ART MOT
MOTS 2020 Results
BDD 100K MOT Results
STATE-OF-THE-ART MOT
There are more than 20 other large-scale MOT datasets, including
Waymo dataset,
nuScenes MOT dataset,
KITTI MOT dataset,
Omni-MOT dataset,
VisDrone dataset,
TAO dataset,
Panda dataset,
etc.
But MOT Challenge (e.g., MOT15-20) and KITTI MOT are still the most popular ones.
Current Research Environment for MOT
[Figure: the MOT research environment vs. the MOT researcher.]
MOT is interesting and useful, but it needs more effort to rethink its evaluation metrics and to unify its evaluation protocols.

Editor's Notes

  • #2 Action recognition of individual people in the video