This is for my presentation at a reading group. I have not contributed to this study in any way; it was done by the researchers named on the first slide.
https://papers.nips.cc/paper/6418-interaction-networks-for-learning-about-objects-relations-and-physics
Learning about Objects, Relations and Physics with Interaction Networks
1. Interaction Networks for
Learning about Objects,
Relations and Physics
Peter Battaglia, Razvan Pascanu,
Matthew Lai, Danilo Jimenez Rezende,
koray kavukcuoglu (Google DeepMind)
NIPS 2016 Reading Club
Presenter: Ken Kuroki (@enuroi)
2. Background & Purpose
• There have been several attempts to learn physical dynamics
(rigid bodies, fluid dynamics, 3D trajectories, etc.)
• This study aims to construct a general-purpose
learnable physics engine
(one that can learn novel physical systems)
3. Model at a Glance
(Diagram: the relational function fR takes the object states o1,t , o2,t and their relation r, and outputs an effect et+1 ; the object function fO takes o2,t together with et+1 and outputs the updated state o2,t+1 )
5. Model in Detail 2
• NO : number of objects, NR : number of relations
• The marshalling function m rearranges the objects and relations
into interaction terms bk = <oi , oj , rk>
• fR maps each bk to an effect ek ; a single object may receive multiple effects
• The effects for each object are aggregated by a into ck , the input to fO
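The per-relation / per-object decomposition above can be sketched element-wise in plain Python. This is a minimal illustration with toy 1-D states; f_R and f_O below stand in for the paper's MLPs, and the toy update rules are assumptions, not the paper's definitions.

```python
# Element-wise sketch of one interaction-network step.
# f_R and f_O stand in for the paper's MLPs; the rules below are toy choices.
def f_R(o_i, o_j, r_k):
    # effect of sender j on receiver i, scaled by the relation attribute
    return r_k * (o_j - o_i)

def f_O(o_i, c_i):
    # object update from its own state and its aggregated effect
    return o_i + c_i

objects = [0.0, 1.0, 3.0]                 # toy 1-D object states
relations = [(0, 1, 0.5), (0, 2, 0.5)]    # (receiver i, sender j, attribute rk)

# f_R yields one effect per relation; effects sharing a receiver are
# aggregated (here: summed) into c_i, the per-object input to f_O.
effects = {}
for i, j, r in relations:
    effects.setdefault(i, []).append(f_R(objects[i], objects[j], r))

updated = [f_O(o, sum(effects.get(i, [0.0]))) for i, o in enumerate(objects)]
```

Object 0 receives two effects (0.5 and 1.5), which sum to 2.0; objects 1 and 2 receive none, so their toy updates leave them unchanged.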
6. Implementation 1
• O : a DS × NO matrix; each column is one object's state vector
• R = <Rr , Rs , Ra>
• Rr (receiver) and Rs (sender) : NO × NR binary matrices indexing,
for each relation, its receiver and sender object
• Ra : a DR × NR matrix of relation attributes
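As a sketch of this encoding, the one-hot receiver/sender matrices can be built from an edge list; the edge list and sizes below are assumed examples, not from the paper.

```python
import numpy as np

NO = 3
edges = [(1, 0), (2, 1)]        # (sender, receiver) pairs, one per relation
NR = len(edges)

Rr = np.zeros((NO, NR))         # column k one-hot-marks the receiver of relation k
Rs = np.zeros((NO, NR))         # column k one-hot-marks the sender of relation k
for k, (s, r) in enumerate(edges):
    Rs[s, k] = 1.0
    Rr[r, k] = 1.0

# With O of shape (DS, NO), O @ Rr selects, column by column,
# the receiver state of each relation (and O @ Rs the sender state).
```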
8. Implementation 3
• Marshalling: B = m(G) = [O Rr ; O Rs ; Ra], a (2DS + DR) × NR matrix
• fR is applied to each column of B, producing the effect matrix E (DE × NR)
• Aggregation: Ē = E RrT sums the effects per receiver object;
C = a(G, X, E) = [O; X; Ē], a (DS + DX + DE) × NO matrix
• fO is applied to each column of C, giving the predictions P = Ot+1
• An additional MLP fA (output length DA) can be appended to predict
an abstract quantity (e.g. the system's potential energy)
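One full forward step in this matrix form might look like the following sketch; the dimensions are toy values, and simple linear maps stand in for the fR and fO MLPs (assumptions for illustration, not the paper's trained networks).

```python
import numpy as np

DS, DX, DR, DE, DP = 4, 1, 1, 3, 2   # toy dimensions (assumed)
NO, NR = 3, 2                        # 3 objects, 2 relations

rng = np.random.default_rng(0)
O  = rng.standard_normal((DS, NO))   # object states, one column per object
X  = rng.standard_normal((DX, NO))   # external effects (e.g. gravity)
Ra = rng.standard_normal((DR, NR))   # relation attributes

Rr = np.zeros((NO, NR)); Rr[0, 0] = Rr[1, 1] = 1.0   # receivers: 0, 1
Rs = np.zeros((NO, NR)); Rs[1, 0] = Rs[2, 1] = 1.0   # senders:   1, 2

W_R = rng.standard_normal((DE, 2 * DS + DR))   # stand-in for the fR MLP
W_O = rng.standard_normal((DP, DS + DX + DE))  # stand-in for the fO MLP

B     = np.concatenate([O @ Rr, O @ Rs, Ra])   # marshalling m(G)
E     = W_R @ B                                # effects, one column per relation
E_bar = E @ Rr.T                               # effects summed per receiver
C     = np.concatenate([O, X, E_bar])          # aggregation a(G, X, E)
P     = W_O @ C                                # predictions, one column per object
```

Note that object 2 is not a receiver of any relation, so its column of Ē is all zeros: it is influenced only by its own state and the external effects.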
9. Architecture
• MLPs (with biases and ReLU activations)
Found by hyperparameter search:
• fR : four 150-length hidden layers, output length 50
• fO : one 100-length hidden layer, output length 2
(x and y velocity)
• fA : one 25-length hidden layer
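A sketch of these MLPs with the stated widths, in NumPy; the input sizes (DS = 6, DR = 1, DX = 1) and the fA in/out sizes are assumed for illustration, since the slide only gives the hidden and output widths.

```python
import numpy as np

def mlp(sizes, rng):
    """Random weights for an MLP with biases; ReLU on the hidden layers."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.maximum(x, 0.0)   # ReLU on hidden layers only
    return x

rng = np.random.default_rng(0)
DS, DR, DX, DE = 6, 1, 1, 50                           # assumed input sizes
f_R = mlp([2 * DS + DR, 150, 150, 150, 150, 50], rng)  # effect output, length 50
f_O = mlp([DS + DX + DE, 100, 2], rng)                 # x and y velocity
f_A = mlp([2, 25, 1], rng)                             # abstract-quantity head (assumed sizes)

e = forward(f_R, np.zeros(2 * DS + DR))
```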
10. Optimization
• Used Adam
Learning rate 0.001, and downscaled by *0.8 for 40
epochs
• L2 regularization
(penalty factor by grid search)
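Reading the slide as a step decay of ×0.8 every 40 epochs, the schedule reduces to one line (a sketch; this interpretation of the decay interval is an assumption):

```python
def lr_at_epoch(epoch, base=0.001, factor=0.8, every=40):
    """Step-decay schedule: multiply the learning rate by `factor` every `every` epochs."""
    return base * factor ** (epoch // every)
```

So epochs 0–39 use 0.001, epochs 40–79 use 0.0008, and so on.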
11. Training
Simulated 2000 scenes over 1000 time steps
• Training : 1 million sample, for 2000 epochs (mini-
batches of 100 to balance distributions)
• Validation : 200k sample
• Test data : 200k sample
13. Comparison
Alternative Models:
1. Constant velocity (output=input)
2. MLP (two 300-length hidden layers)
input: flattened vector of all the input data
3. Interaction Network without E (interaction)
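The first two baselines can be sketched as follows; the shapes are toy values (assumptions), and the point is only that baseline 2 sees one flat vector with no explicit per-object structure.

```python
import numpy as np

# Baseline 1: constant velocity — predict the next-step velocity
# equal to the current one (output = input).
def constant_velocity(v):
    return v

# Baseline 2 input: all object states flattened into one vector, so the
# per-object/per-relation structure the IN exploits is only implicit.
NO, DS = 3, 6                          # toy sizes
O = np.zeros((DS, NO))                 # one column per object
flat = O.T.reshape(-1)                 # length NO * DS input vector
```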
15. Discussion
1. Performed better than alternatives
2. Baseline MLP couldn't effectively learn interaction
3. To understand "intuitive physics engine" in human
4. Potential to expand the model
16. Presenter's Comments
1. Can be applied to a larger system?
(time & memory-wise)
2. Probably it can be parallelized
3. Really advantageous to alternatives?