A survey on methods and
applications of meta-learning with
GNNs
Paper by Debmalya Mandal, Sourav Medya, Brian Uzzi, Charu Aggarwal
Presented by Shreya Goyal
Meta-Learning:
A subfield of deep learning and an exciting area of research that addresses the
problem of training a model with very few samples. It rests on the idea of
learning to learn, so that a model can be adapted to a new task from only a
handful of examples.
GNNs (Graph neural networks):
● GNNs generalize deep neural networks to graph-structured data (a minimal message-passing step is sketched below).
● They have been used in various domains to solve complicated problems involving graph-structured data.
● For example, in drug discovery, the goal is to find molecules that are likely to form a drug, where each input molecule is represented as a graph.
● In recommender systems, the goal is to predict links between users and items, which are represented as nodes of a graph.
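To make the bullet about generalizing neural networks to graphs concrete, here is a minimal sketch of one graph-convolution propagation step in the style of the GCN of Kipf and Welling (cited in the references). It is illustrative only; the matrix sizes and the toy path graph are assumptions, and practical work would use a GNN library such as PyTorch Geometric or DGL.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN-style step: H' = relu(A_norm @ H @ W), with self-loops and
    symmetric degree normalization of the adjacency matrix."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d = A_hat.sum(axis=1)                       # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)      # aggregate neighbors, transform, ReLU

# Toy usage: 4 nodes on a path graph, 3 input features, 2 output features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.randn(4, 3)
W = np.random.randn(3, 2)
print(gcn_layer(A, H, W).shape)  # (4, 2)
```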
Meta-learning for GNNs:
Despite their recent success, GNNs have drawbacks. One of them is applying GNNs to problems with
very few samples to train the model. Even problems with very large graph datasets sometimes have
only a limited number of labeled samples. Moreover, as in recommender systems, the model needs to
handle diverse real-life situations and adapt to them with very limited data.
Recently, meta-learning has eased the limited-sample problem in deep-learning fields such as
natural language processing, robotics, and health care, and it could do the same for GNNs.
Several meta-learning methods for training GNNs have recently been proposed for various
applications. The main challenge in applying meta-learning to graph-structured data is to
determine the type of representation that is shared across tasks and to devise an effective
training strategy.
Node embedding
The motivation for node embeddings is to capture the characteristics of the nodes of a
graph so that any downstream application can work directly with these representations,
without considering the original graph. The problem is often challenging because many
nodes have very few connections.
Liu et al. [Liu+20] address this issue by applying meta-learning to the problem of node
embedding. They set up a regression problem with a common prior for learning the node
embeddings. The meta-training set is defined by the high-degree nodes (those with many
neighbors), whose embeddings can be learned accurately, and the meta-testing set is
defined by the low-degree nodes with only a few neighbors. Learning the representation
of a test node is formulated as a meta-testing problem, and the common prior is adapted
with a small number of samples to learn the embeddings of such nodes.
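The following is a loose, hedged sketch of the episode setup described above, not the authors' actual method: nodes are split by degree into meta-training and meta-testing sets, and a shared prior embedding is adapted with a few gradient steps toward a low-degree node's few observed neighbors. The threshold, learning rate, and the mean-squared-error stand-in for the regression objective are all assumptions.

```python
import numpy as np

def split_by_degree(adjacency, threshold=5):
    """Split node indices into high-degree (meta-training) and
    low-degree (meta-testing) sets based on a degree threshold."""
    degrees = adjacency.sum(axis=1)
    meta_train = np.where(degrees >= threshold)[0]
    meta_test = np.where(degrees < threshold)[0]
    return meta_train, meta_test

def adapt_embedding(prior, neighbor_embs, lr=0.1, steps=5):
    """Adapt the shared prior embedding so that it regresses toward the
    (few) neighbor embeddings of a low-degree node."""
    z = prior.copy()
    for _ in range(steps):
        grad = (z - neighbor_embs).mean(axis=0)   # gradient of mean squared error
        z -= lr * grad
    return z
```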
Node classification
The goal of node classification is to find the missing labels of nodes in a partially
labeled graph. Examples of node classification problems are document categorization
and protein classification, and these problems have received significant attention in
recent years. The obstacle is that many classes are novel, i.e., they have very few
labeled nodes. Because of this scarcity of labeled samples, meta-learning techniques
are well suited to the problem.
Zhou et al. [Zho+19] apply a meta-learning approach to node classification based on
transferable knowledge. Some structure is shared across classes: during meta-training
this shared knowledge is learned from classes with many labeled examples, and during
meta-testing it is reused to classify nodes from classes with only a few labeled samples.
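As a concrete illustration of how such few-shot node-classification tasks are usually formed, here is a hedged sketch of generic episodic N-way K-shot sampling. It is the standard recipe behind approaches of this kind, not Zhou et al.'s exact code; the `node_labels` dictionary and the parameter values are hypothetical.

```python
import random
from collections import defaultdict

def sample_episode(node_labels, n_way=3, k_shot=5, q_query=10):
    """Sample one few-shot task: n_way classes, k_shot labeled support nodes
    and q_query query nodes per class. node_labels maps node id -> class."""
    by_class = defaultdict(list)
    for node, label in node_labels.items():
        by_class[label].append(node)
    eligible = [c for c, nodes in by_class.items() if len(nodes) >= k_shot + q_query]
    classes = random.sample(eligible, n_way)
    support, query = [], []
    for c in classes:
        nodes = random.sample(by_class[c], k_shot + q_query)
        support += [(n, c) for n in nodes[:k_shot]]
        query += [(n, c) for n in nodes[k_shot:]]
    return support, query
```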
Link prediction
Link prediction is the problem of deciding whether a link exists between two nodes in a network.
Meta-learning is useful for learning new relationships via edges in multi-relational graphs, where
an edge is defined as a triple of two nodes and a relation. The goal of link prediction in
multi-relational graphs is to predict new triples for a relation r after observing only a few
triples involving r. The problem is challenging because only a limited number of triples are
available for a particular relation r.
Multi-relational graphs are even harder to manage because of their dynamic nature (new nodes are
added over time), and learning is more difficult still when these newly added nodes have only a
few links. Baek et al. [BLH20] introduced a link prediction technique that predicts links between
seen and unseen nodes as well as between unseen nodes. The main idea is to randomly split the
entities in a given graph into a meta-training set and a meta-testing set: the training set
consists of simulated unseen entities, while the testing set consists of real unseen entities.
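Below is a hedged sketch of the entity-split idea described above, not the authors' implementation: a fraction of entities is held out as "simulated unseen" so that meta-training episodes mimic the real unseen entities seen at meta-test time. The fraction, the seed, and the support/few-shot naming are assumptions.

```python
import random

def split_entities(triples, unseen_fraction=0.1, seed=0):
    """triples: list of (head, relation, tail) tuples.
    Returns (support_triples, few_shot_triples), where few-shot triples
    involve at least one simulated-unseen entity."""
    rng = random.Random(seed)
    entities = sorted({e for h, _, t in triples for e in (h, t)})
    simulated_unseen = set(rng.sample(entities, int(unseen_fraction * len(entities))))
    support = [tr for tr in triples
               if tr[0] not in simulated_unseen and tr[2] not in simulated_unseen]
    few_shot = [tr for tr in triples
                if tr[0] in simulated_unseen or tr[2] in simulated_unseen]
    return support, few_shot
```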
Node/Edge level shared representation
A shared representation at the node/edge level means that, across different tasks, nodes or edges are
common in a given input graph. Huang et al. [HZ20] consider the node classification problem where the
input graphs, as well as the labels, can differ across tasks.
Here, d(u,v) is the length of the shortest path between nodes u and v. This metric is
used to construct a local subgraph S_u around each node u (e.g., the nodes within a
small shortest-path distance of u), because the influence of a node v on u decreases
exponentially as the shortest-path distance between them increases. To learn the
embedding of node u, S_u is fed to a GCN. Once node embeddings are available, any
function that maps the encoding to class labels can be learned. The authors use MAML
(model-agnostic meta-learning) to learn this function from very few samples on a new
task, enjoying the benefits of local shared representations in node classification.
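Since MAML is the key ingredient here, the following is a minimal first-order MAML sketch for a generic linear model trained with squared error. It is illustrative only, not G-META's GCN-based classifier; the task tuples (X_support, y_support, X_query, y_query) and all learning rates are assumptions.

```python
import numpy as np

def loss_and_grad(w, X, y):
    """Squared-error loss and its gradient for a linear model y_hat = X @ w."""
    err = X @ w - y
    return 0.5 * np.mean(err ** 2), X.T @ err / len(y)

def maml_train(tasks, dim, inner_lr=0.05, outer_lr=0.01, inner_steps=3, epochs=200):
    """tasks: list of (X_support, y_support, X_query, y_query) tuples."""
    w = np.zeros(dim)                          # shared initialization (the learned prior)
    for _ in range(epochs):
        meta_grad = np.zeros(dim)
        for Xs, ys, Xq, yq in tasks:
            w_task = w.copy()
            for _ in range(inner_steps):       # inner loop: adapt to the task's few samples
                _, g = loss_and_grad(w_task, Xs, ys)
                w_task -= inner_lr * g
            _, gq = loss_and_grad(w_task, Xq, yq)
            meta_grad += gq                    # first-order approximation of the meta-gradient
        w -= outer_lr * meta_grad / len(tasks) # outer loop: update the shared initialization
    return w
```

At meta-test time the returned initialization is adapted to a new task with the same few inner-loop steps, which is what allows learning from very few samples.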
Graph level shared representation
A shared representation at the graph level means that, across different tasks, the whole graph is
the common part. A canonical application of this representation is graph classification, where the
goal is to assign a given graph to one of several classes. Graph classification requires a large
number of samples for high-quality prediction, yet in real-world problems only a limited number of
samples are available for a given label. Meta-learning can handle this problem.
Chauhan et al. [CNK20] proposed few-shot graph classification based on graph
spectral measures. In particular, they train a feature extractor Fθ to extract features
from the graphs during meta-training. For classification, they use two units: Csup,
which predicts the super-class probability of a graph, and CGAT, a graph attention
network that predicts the graph class label. During the meta-test phase, the weights of
Fθ and Csup are fixed, and CGAT is retrained on the new test classes. Because the
feature extractor Fθ is the shared structure and is not retrained on the test tasks, this
approach requires only a few samples from the new classes.
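The following hedged sketch shows the meta-test protocol in miniature: the shared extractor stays frozen and only a lightweight classifier is fitted on the few labeled graphs from the new classes. A plain softmax-regression head stands in for the retrained unit; it does not reproduce the CGAT or Csup architectures, and the learning rate and epoch count are assumptions.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def fit_head(frozen_features, labels, n_classes, lr=0.5, epochs=200):
    """frozen_features: (n_samples, d) array produced by the fixed extractor F_theta.
    Only the head weights W are trained, by gradient descent on cross-entropy."""
    n, d = frozen_features.shape
    W = np.zeros((d, n_classes))                     # only the head is trainable
    Y = np.eye(n_classes)[labels]                    # one-hot targets
    for _ in range(epochs):
        P = softmax(frozen_features @ W)
        W -= lr * frozen_features.T @ (P - Y) / n    # cross-entropy gradient step
    return W
```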
Conclusion
The survey provides a comprehensive review of works that combine graph neural
networks (GNNs) and meta-learning, along with a summary of the methods and
applications in each category. The application of meta-learning to GNNs is a
growing and exciting field, and many graph problems stand to benefit immensely
from the combination of the two approaches.
References
● https://arxiv.org/pdf/2103.00137.pdf
● https://dl.acm.org/doi/10.1145/3340531.3411910
● https://arxiv.org/pdf/1905.09718.pdf
● https://arxiv.org/pdf/2006.06648.pdf
● https://arxiv.org/pdf/2006.07889.pdf
● https://openreview.net/attachment?id=Bkeeca4Kvr&name=original_pdf
● https://arxiv.org/pdf/2003.08246v1.pdf
● https://arxiv.org/pdf/1609.02907.pdf
