1. RGE: A Repulsive Graph Rectification for Node
Classification via Influence
Jaeyun Song*, Sung-Yub Kim*, Eunho Yang
ICML 2023
Graduate School of AI, KAIST
2. Introduction
● Graph Neural Networks (GNNs) are susceptible to structural noise in graphs.
● Chen et al. (2022a) proposed an influence function for individual edges to estimate
the counterfactual effects of removing them.
● We first identify the group influence estimation error of Exhaustive Group
Elimination (EGE), which eliminates all opponent edges at once, and then propose
Repulsive Group Elimination (RGE), which removes distant edges rather than
neighboring ones to reduce the group effect in GNNs.
A single harmful edge (dotted) can negatively affect the
predictions of multiple nodes (colored).
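To make the notion of edge influence concrete, here is a toy NumPy sketch that measures the counterfactual effect of deleting one edge as the change in validation loss of a 1-layer SGC, with the trained weights held fixed. This is a crude proxy, not Chen et al.'s influence function (which also accounts for the parameter change); all function names here are illustrative, not from the paper.

```python
import numpy as np

def sym_norm_adj(A):
    # Symmetrically normalized adjacency with self-loops, as in GCN/SGC.
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def val_loss(A, X, W, y, val_idx):
    # Cross-entropy of a 1-layer SGC, softmax(A_hat @ X @ W), on validation nodes.
    logits = sym_norm_adj(A) @ X @ W
    logits = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(logits)
    p = p / p.sum(axis=1, keepdims=True)
    return -np.mean(np.log(p[val_idx, y[val_idx]] + 1e-12))

def edge_influence(A, X, W, y, val_idx, edge):
    # Counterfactual effect of deleting `edge`: validation loss on the graph
    # without the edge minus the loss on the original graph. Negative means
    # removal is predicted to help.
    i, j = edge
    A_minus = A.copy()
    A_minus[i, j] = A_minus[j, i] = 0.0
    return val_loss(A_minus, X, W, y, val_idx) - val_loss(A, X, W, y, val_idx)
```

With this sign convention, an edge whose removal lowers the validation loss has negative influence — these are the "opponent" edges targeted below.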
3. Problem
• Our goal is to find opponent edges 𝐸′ ⊆ 𝐸 whose removal from the original graph
𝐺 = (𝑉, 𝐸) improves test performance.
• To avoid the combinatorial explosion of selecting an edge set, we consider a simple
strategy, called Exhaustive Group Elimination (EGE): eliminate all edges with
negative individual influence in a single step.
• The key assumption for the optimality of EGE is the additivity of individual influences.
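Under the additivity assumption, EGE reduces to a one-shot filter over precomputed individual influences. A minimal sketch (the function name and the dict-of-influences interface are assumptions for illustration, not the paper's code):

```python
def exhaustive_group_elimination(influences):
    # EGE: assuming individual influences simply add up, remove in one shot
    # every edge whose estimated influence is negative, i.e. whose removal
    # is predicted to lower the validation loss.
    return [e for e, v in influences.items() if v < 0]
```

The estimation error arises exactly when this additivity fails: the joint effect of removing the whole set can differ from the sum of the individual effects.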
4. Problem
• Even though influence additivity cannot be guaranteed in general graphs, we found
that influence additivity can be established between distant edges for widely used
GNNs, including GCN, SGC, GAT, and GraphSAGE.
• Proposition 3.1 (Influence additivity for distant edges). Let edges 𝑒, 𝑒′ ∈ 𝐸 be
distant edges and assume that the removal of edge 𝑒 does not change the gradient of
any unaffected train node 𝑣, i.e., for 𝑣 ∈ 𝑉𝑇𝑟 with 𝑣 ∉ 𝑉𝑇𝑟(𝑒). Then, influence
additivity holds.
• Detailed proofs can be found in Appendix B!
5. Method Overview
• Overview of our method:
• Compute the influence of every edge
• Remove non-adjacent (distant) edges with negative influence
• Retrain GNNs on the graph without those edges
• Repeat until no edges with negative influence remain
(Figure: starting from the original graph, negative distant edges are removed and the
GNNs retrained; repeating this process yields the rectified graph. Legend: labeled /
unlabeled nodes, non-negative / negative edges.)
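The overview above can be sketched as a single loop. This is an illustrative skeleton, not the paper's implementation: edges are (u, v) tuples, `influence_fn` and `retrain_fn` are assumed callables, and "distant" is simplified to "sharing no endpoint" (the 1-layer case).

```python
def rge(edges, influence_fn, retrain_fn):
    # Iterative rectification sketch: at each step, estimate edge influences,
    # greedily keep only mutually non-adjacent ("distant") negative edges,
    # delete them, and retrain before the next round.
    edges = set(edges)
    while True:
        infl = {e: influence_fn(e, edges) for e in edges}
        negatives = sorted((e for e in edges if infl[e] < 0), key=infl.get)
        chosen, used = [], set()
        for e in negatives:
            if not (set(e) & used):   # repulsive rule: skip adjacent edges
                chosen.append(e)
                used.update(e)
        if not chosen:                # converged: no negative edges remain
            return edges
        edges -= set(chosen)
        retrain_fn(edges)             # retrain the GNN on the rectified graph
```

Because adjacent negative edges are deferred to later iterations (after a retrain), each batch of removals stays within the regime where individual influences are approximately additive.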
6. Method
• Repulsive selection rule & Multi-step edge elimination (under 1-layer SGC)
(Figure: 1st iteration — choose the most negative edge, then the second most negative
non-adjacent edge; remove the selected edges and retrain the GNNs. 2nd iteration —
choose the most negative edge among the remaining ones; remove and retrain. The
process repeats. Legend: labeled / unlabeled nodes, non-negative / negative edges,
selected edges.)
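The repulsive selection rule within one iteration can be isolated as follows — a minimal sketch assuming the 1-layer case, where "non-adjacent" means the edges share no node; the function name is hypothetical.

```python
def repulsive_select(influences):
    # Repulsive selection rule: walk edges from most to least negative
    # influence, keeping an edge only if it shares no node with one already
    # selected; stop once influences become non-negative.
    chosen, used = [], set()
    for edge, infl in sorted(influences.items(), key=lambda kv: kv[1]):
        if infl >= 0:
            break
        if not (set(edge) & used):
            chosen.append(edge)
            used.update(edge)
    return chosen
```

For example, given influences {(0,1): −3.0, (1,2): −2.0, (3,4): −1.0}, the rule picks (0,1), skips (1,2) because it shares node 1, and then picks (3,4) — matching the "second most negative non-adjacent edge" step in the figure.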
7. Experiment: Group Influence Estimation Errors
• RGE significantly decreases group influence estimation errors.
(Figure: comparison of group influence estimation errors for EGE vs. RGE.)
9. Experiment: Performance under Other Architectures
• Rectified graphs under SGC are also effective on other architectures.
10. Conclusion
• We demonstrate that removing opponent edges simultaneously can increase the
estimation error of group influence, which may result in performance drops.
• We propose a new approach that rectifies graphs over multiple steps while reducing
group influence estimation errors by removing only distant edges at each step.
• We show that our method reduces group influence estimation errors and exhibits
superior performance compared to baselines.