9. “we claim that when designing model for recommendation, it is important to perform rigorous ablation studies to be clear about the impact of each operation. Otherwise, including less useful operations will complicate the model unnecessarily, increase the training difficulty, and even degrade model effectiveness.”
“the deterioration of NGCF stems from the training difficulty, rather than overfitting”
Ablation Study - Conclusion
15. Rationale (1) - Self-Connection is implied.
Define: Ã = A + I, the propagation matrix with self-connections (normalization omitted for simplicity).
Then: E^(K) = Ã^K E^(0) = (A + I)^K E^(0).
By the Binomial Theorem: (A + I)^K = Σ_{k=0}^{K} C(K, k) A^k, so E^(K) = Σ_{k=0}^{K} C(K, k) A^k E^(0) -
a weighted sum of the embeddings propagated at each layer, i.e. the same
form as the layer combination on the previous slide.
Actually, this argument comes from
“Simplifying Graph Convolutional Networks”
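A quick numeric check of the binomial identity above, as a sketch with a toy 3-node adjacency matrix (normalization is omitted, matching the slide's simplification; the helper names are my own):

```python
from math import comb

def matmul(X, Y):
    """Multiply two square matrices given as nested lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def matpow(X, p):
    """Raise a square matrix to an integer power (p >= 0)."""
    n = len(X)
    R = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # identity
    for _ in range(p):
        R = matmul(R, X)
    return R

# Toy adjacency matrix A for a 3-node path graph.
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
K = 3

# Left side: (A + I)^K -- propagation WITH self-connections.
lhs = matpow([[A[i][j] + I[i][j] for j in range(3)] for i in range(3)], K)

# Right side: sum_k C(K, k) * A^k -- layer combination (binomial weights),
# propagation WITHOUT self-connections.
rhs = [[0] * 3 for _ in range(3)]
for k in range(K + 1):
    Ak = matpow(A, k)
    rhs = [[rhs[i][j] + comb(K, k) * Ak[i][j] for j in range(3)] for i in range(3)]

assert lhs == rhs  # self-connections are subsumed by layer combination
```

The identity holds exactly because A and I commute, which is why layer combination alone can reproduce the effect of self-connections.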
19. Training Recipe
- Loss: BPR (Bayesian Personalized Ranking)
- Optimizer: Adam
- No dropout (LightGCN has no feature transformation; L2 regularization on the embeddings suffices)
- Layer combination coefficient: uniformly 1/(K + 1)
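As a sketch of the recipe above (function names are my own; LightGCN's reference implementation has its own training loop), the BPR loss over (positive, negative) score pairs and the uniform 1/(K + 1) layer combination can be written as:

```python
import math

def bpr_loss(pos_scores, neg_scores):
    """BPR: mean of -log sigmoid(s_pos - s_neg) over sampled (u, i, j) triples."""
    return -sum(math.log(1.0 / (1.0 + math.exp(-(p - n))))
                for p, n in zip(pos_scores, neg_scores)) / len(pos_scores)

def combine_layers(layer_embs):
    """Average the K+1 per-layer embeddings of one node with coefficient 1/(K + 1)."""
    K = len(layer_embs) - 1
    dim = len(layer_embs[0])
    return [sum(e[d] for e in layer_embs) / (K + 1) for d in range(dim)]

# Usage: one positive scored 2.0 vs. one negative scored 0.0,
# and combining a node's layer-0 and layer-1 embeddings.
loss = bpr_loss([2.0], [0.0])            # -log(sigmoid(2.0)) ~= 0.1269
final_emb = combine_layers([[1.0, 2.0],  # e^(0)
                            [3.0, 4.0]]) # e^(1)  ->  [2.0, 3.0]
```

In practice the scores would be inner products of the combined user and item embeddings, and the loss would be minimized with Adam as listed above.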
20. Experiments
- Comparison with NGCF
- Comparison with SOTA
- Ablation Studies
- Impact of Layer Combination
- Impact of Symmetric Sqrt Normalization
- Analysis of Embedding Smoothness
- Hyper-Parameter Study