Van Thuy Hoang
Network Science Lab
Dept. of Artificial Intelligence
The Catholic University of Korea
E-mail: hoangvanthuy90@gmail.com
240302
Jun Xia et al., WWW 22
Graph Convolutional Networks (GCNs)
 Generate node embeddings based on local network neighborhoods
 Nodes have embeddings at each layer, computed by repeatedly combining messages from their neighbors using neural networks (a generic update is sketched below)
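As background (not on the slide itself), a generic message-passing update for node 𝑣 at layer 𝑘 can be sketched as follows, where AGG is a permutation-invariant aggregator such as sum or mean; the exact normalization used by GCNs differs slightly:

\[
h_v^{(k+1)} = \sigma\Big( W^{(k)} \cdot \mathrm{AGG}\big(\{\, h_u^{(k)} : u \in \mathcal{N}(v) \cup \{v\} \,\}\big) \Big)
\]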
Problems
 Most existing GNNs are trained in a supervised manner, and it is often resource- and time-intensive to collect abundant true-labeled data
 Graph self-supervised learning learns representations from unlabeled graphs
Examples: GCA and the GRAph Contrastive rEpresentation learning (GRACE) model, ICML 20
Problems
 However, these augmentations are not suitable for all scenarios because the structural information and semantics of graphs vary significantly across domains
 It is difficult to preserve semantics well during augmentation, given the diverse nature of graph data
 Edge perturbation benefits social networks but hurts some biochemical molecules in GCL
 Dropping a carbon atom in the phenyl ring alters the aromatic system and yields an alkene chain, which drastically changes the molecular properties
Solutions
 Can we emancipate graph contrastive learning from tedious manual trial-and-error, cumbersome search, or expensive domain knowledge?
Illustration of SimGRACE
 Instead of augmenting the graph data, we feed the original graph G into a GNN encoder 𝑓(·; 𝜽) and its perturbed version 𝑓(·; 𝜽′)
 After passing through a shared projection head 𝑔(·), we maximize the agreement between representations 𝒛_𝑖 and 𝒛_𝑗 via a contrastive loss (a minimal sketch follows)
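A minimal PyTorch sketch of this pipeline (not the authors' code; the encoder, `proj_head`, `eta`, and `nt_xent` are placeholder names, and the weight-perturbation scheme is assumed from the paper's description):

```python
import copy
import torch
import torch.nn.functional as F

def perturbed_copy(encoder: torch.nn.Module, eta: float = 1.0) -> torch.nn.Module:
    """Return a copy of the encoder whose weights receive Gaussian noise.
    The noise is scaled by eta and by each tensor's own standard deviation
    (an assumed layer-wise scheme; the encoder-perturbation slide gives the formula)."""
    encoder_p = copy.deepcopy(encoder)
    with torch.no_grad():
        for p in encoder_p.parameters():
            scale = p.std() if p.numel() > 1 else 1.0
            p.add_(eta * scale * torch.randn_like(p))
    return encoder_p

def simgrace_step(encoder, proj_head, batch, eta=1.0, tau=0.1):
    # Two views of the *same* graphs: one from the original encoder,
    # one from its randomly perturbed copy -- no data augmentation.
    encoder_p = perturbed_copy(encoder, eta)
    h_i = encoder(batch)                 # graph-level embeddings, shape [B, d]
    h_j = encoder_p(batch)
    z_i = F.normalize(proj_head(h_i), dim=1)
    z_j = F.normalize(proj_head(h_j), dim=1)
    return nt_xent(z_i, z_j, tau)        # NT-Xent loss, sketched on the loss slide
```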
SimGRACE differs from others
 (1) SimGRACE perturbs the encoder with random Gaussian noise instead of momentum updating
 (2) SimGRACE does not require data augmentation, while BGRL and MERIT take it as a prerequisite
 (3) SimGRACE focuses on graph-level representation learning, while BGRL and MERIT only work on node-level tasks
SimGRACE
(1) Encoder perturbation
The perturbation term is sampled from a Gaussian distribution with zero mean and a layer-specific variance (a plausible form is given below)
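A plausible reconstruction of the perturbation (η is the perturbation magnitude and σ_𝑙² a layer-specific variance tied to the 𝑙-th layer's weights; the exact variance term is cut off on the slide):

\[
\boldsymbol{\theta}'_l = \boldsymbol{\theta}_l + \eta \cdot \Delta\boldsymbol{\theta}_l, \qquad \Delta\boldsymbol{\theta}_l \sim \mathcal{N}\big(0,\ \sigma_l^2\big)
\]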
(2) Projection head
 A two-layer MLP maps the encoder outputs to 𝑧 and 𝑧′ (a minimal sketch follows)
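A minimal sketch of such a head (the width 128 is an arbitrary example, not the paper's setting):

```python
import torch.nn as nn

emb_dim = 128  # example width; matches the GNN encoder's output dimension

# Two-layer MLP projection head g(.) mapping encoder outputs h to z.
proj_head = nn.Sequential(
    nn.Linear(emb_dim, emb_dim),
    nn.ReLU(inplace=True),
    nn.Linear(emb_dim, emb_dim),
)
```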
(3) Contrastive loss
 The normalized temperature-scaled cross-entropy loss (NT-Xent), as in other works (a simplified implementation is sketched below)
 sim(·, ·): the cosine similarity function
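A simplified implementation of this loss for the pipeline sketch above (positives are matching rows across the two views; restricting negatives to cross-view pairs slightly simplifies the slide's formulation):

```python
import torch
import torch.nn.functional as F

def nt_xent(z_i: torch.Tensor, z_j: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """NT-Xent loss between two batches of L2-normalized representations [B, d]."""
    sim = z_i @ z_j.t() / tau                              # cosine similarities (inputs normalized)
    targets = torch.arange(z_i.size(0), device=z_i.device)
    return F.cross_entropy(sim, targets)                   # -log softmax of the positive pair
```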
Why can SimGRACE work well?
 Two key properties related to contrastive learning, alignment and uniformity, are examined, and two metrics are proposed to measure the quality of representations obtained via contrastive learning (standard definitions below)
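For reference, the standard definitions of these two metrics (following Wang & Isola, 2020; the paper's graph-level variants may differ in detail):

\[
\mathcal{L}_{\mathrm{align}} = \mathbb{E}_{(x,\, x^{+})}\big[\, \lVert f(x) - f(x^{+}) \rVert_2^{2} \,\big],
\qquad
\mathcal{L}_{\mathrm{uniform}} = \log\, \mathbb{E}_{x,\, y \,\sim\, p_{\mathrm{data}}}\big[\, e^{-2 \lVert f(x) - f(y) \rVert_2^{2}} \,\big]
\]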
AT-SimGRACE
 AT-SimGRACE utilizes Adversarial Training (AT) to improve the adversarial robustness of SimGRACE in a principled way
 Generally, AT directly incorporates adversarial examples into the training process to solve the following optimization problem, where x′_𝑖 is the adversarial example within the 𝜖-ball (bounded by an 𝐿_𝑝-norm)
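The generic AT objective referred to above, in its standard min-max form (𝑛 training examples, loss ℒ):

\[
\min_{\theta}\ \frac{1}{n} \sum_{i=1}^{n}\ \max_{\lVert x'_i - x_i \rVert_p \,\le\, \epsilon}\ \mathcal{L}\big(f(x'_i;\, \theta),\, y_i\big)
\]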
Experimental Setup
 For unsupervised and semi-supervised learning:
 the TUDataset benchmark, including graph data for various social networks and biochemical molecules
 For transfer learning:
 pre-training on ZINC-2M and PPI-306K
 fine-tuning the model on various datasets, including PPI, BBBP, ToxCast, and SIDER
EXPERIMENTS
 Results for the transfer learning setting
EXPERIMENTS
 Hyper-parameter sensitivity analysis
 Without perturbation, SimGRACE simply compares two original samples as a negative pair while the positive-pair loss becomes zero, which homogeneously pushes all graph representations away from each other and is hard to justify intuitively
Conclusion
 A simple framework (SimGRACE) for graph contrastive learning
 SimGRACE can outperform or match state-of-the-art competitors on multiple graph datasets of various scales and types
 SimGRACE emancipates graph contrastive learning from tedious manual tuning, cumbersome search, and expensive domain knowledge