NS-CUK Seminar: H.E.Lee, Review on "Structural Deep Embedding for Hyper-Networks", AAAI 2018
Hyo Eun Lee
Network Science Lab
Dept. of Biotechnology
The Catholic University of Korea
E-mail: gydnsml@gmail.com
2023.07.17
AAAI 2018
 Notations and Definitions
 Deep Hyper-Network Embedding
• Loss function
• Optimization
 Experiment
• Datasets and Parameter Settings
• Network Reconstruction
• Link Prediction and Classification
• Parameter Sensitivity
 Conclusion
1. Notations and Definitions
Definitions
• Definition 1 (hyper-network): $G = (V, E)$, where $V = \{V_t\}_{t=1}^{T}$ is the set of $T$ node types and $E = \{E_i = (v_1, \dots, v_{n_i})\}$ with $n_i \ge 2$ is the set of hyperedges (a toy example follows these definitions).
 If every hyperedge contains exactly 2 nodes, the hyper-network degenerates to an ordinary network.
 If $T \ge 2$, the hyper-network is a heterogeneous hyper-network.
• Definition 2 (first-order proximity)
: First-order proximity measures the N-tuplewise similarity among nodes.
 For $N$ nodes joined by a hyperedge, their N-tuplewise similarity is defined as 1.
 A hyperedge over $N$ nodes carries no information about any proper subset of those nodes.
• Definition 3 (second-order proximity)
: Similarity between two nodes measured through the hyperedge neighborhoods they share.
 Considering neighborhood structure uncovers hidden patterns or relationships.
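To make these definitions concrete, here is a minimal sketch in plain Python/NumPy. The node names, the type partition, and the hyperedge list are invented for illustration (not from the paper); it stores a small heterogeneous hyper-network as an incidence matrix and reads the first-order indicator directly off the hyperedge set.

```python
import numpy as np

# Hypothetical heterogeneous hyper-network with T = 3 node types
# (e.g. users u*, movies m*, tags t* -- names are illustrative only).
nodes = ["u0", "u1", "m0", "m1", "t0"]
node_index = {v: i for i, v in enumerate(nodes)}

# Each hyperedge joins one node of every type (n_i = 3 here).
hyperedges = [("u0", "m0", "t0"),
              ("u1", "m0", "t0"),
              ("u1", "m1", "t0")]

# Incidence matrix H: H[v, e] = 1 if node v participates in hyperedge e.
H = np.zeros((len(nodes), len(hyperedges)), dtype=int)
for e, edge in enumerate(hyperedges):
    for v in edge:
        H[node_index[v], e] = 1

def first_order(tuple_of_nodes):
    """First-order indicator: 1 if the N-tuple forms a hyperedge, else 0."""
    edge_set = {tuple(sorted(e)) for e in hyperedges}
    return int(tuple(sorted(tuple_of_nodes)) in edge_set)

print(first_order(("u0", "m0", "t0")))  # 1 -- this hyperedge exists
print(first_order(("u0", "m1", "t0")))  # 0 -- no such hyperedge, even though the nodes overlap pairwise
```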
2. Deep Hyper-Network Embedding
Loss function
• If a hyperedge exists among the $N$ vertices, their N-tuplewise similarity $S$ must be large; otherwise it must be small.
• Property 1.
 If $(v_1, v_2, \dots, v_N) \in E$, then $S(X_1, X_2, \dots, X_N)$ should be large.
 If $(v_1, v_2, \dots, v_N) \notin E$, then $S(X_1, X_2, \dots, X_N)$ should be small.
+ In this paper, $N = 3$.
• Theorem 1. A linear tuplewise similarity function cannot satisfy Property 1, so a nonlinear one is used:
$L_{ijk} = \sigma\!\left(W_a^{(2)} X_i^a + W_b^{(2)} X_j^b + W_c^{(2)} X_k^c + b^{(2)}\right)$
$S_{ijk} = S\!\left(X_i^a, X_j^b, X_k^c\right) = \sigma\!\left(W^{(3)} L_{ijk} + b^{(3)}\right)$
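A minimal NumPy sketch of the tuplewise similarity above. It assumes the per-type embeddings $X_i^a$, $X_j^b$, $X_k^c$ are already available; the layer sizes and the random weights are placeholders for illustration, not trained DHNE parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
d, l = 32, 16  # embedding size d and hidden ("latent") layer size l -- illustrative values

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Second-layer weights, one block per node type (W_a^(2), W_b^(2), W_c^(2)), plus third-layer weights.
W2a, W2b, W2c = (rng.normal(scale=0.1, size=(l, d)) for _ in range(3))
b2 = np.zeros(l)
W3 = rng.normal(scale=0.1, size=(1, l))
b3 = np.zeros(1)

def tuple_similarity(Xa, Xb, Xc):
    """S_ijk = sigma(W3 L_ijk + b3), with L_ijk = sigma(W2a Xa + W2b Xb + W2c Xc + b2)."""
    L_ijk = sigmoid(W2a @ Xa + W2b @ Xb + W2c @ Xc + b2)
    return sigmoid(W3 @ L_ijk + b3)  # value in (0, 1): large if (i, j, k) should form a hyperedge

Xa, Xb, Xc = (rng.normal(size=d) for _ in range(3))
print(tuple_similarity(Xa, Xb, Xc))
```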
2. Deep Hyper-Network Embedding
Loss function
• Preserve the neighborhood structure with an autoencoder.
$A = H H^{T} - D_v$
• Encoder: $X_i = \sigma\!\left(W^{(1)} A_i + b^{(1)}\right)$
• Decoder: $\hat{A}_i = \sigma\!\left(\hat{W}^{(1)} X_i + \hat{b}^{(1)}\right)$
• Per-node reconstruction error: $\left\| \operatorname{sign}(A_i) \odot (A_i - \hat{A}_i) \right\|_F^2$
∴ $\mathcal{L}_2 = \sum_t \left\| \operatorname{sign}(A_i^t) \odot (A_i^t - \hat{A}_i^t) \right\|_F^2$
∴ $\mathcal{L} = \mathcal{L}_1 + \alpha \mathcal{L}_2$
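A compact sketch of the second-order objective, assuming an incidence matrix $H$ like the toy one above. A single autoencoder is shown for brevity (the model uses one per node type), the weights are random placeholders, and the first-order term $\mathcal{L}_1$ is stubbed out, so this illustrates the shape of the loss rather than the trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Adjacency from the incidence matrix: A = H H^T - D_v (D_v holds the vertex degrees).
H = rng.integers(0, 2, size=(5, 3))               # placeholder incidence matrix
D_v = np.diag(H.sum(axis=1))
A = H @ H.T - D_v

n, d = A.shape[0], 4                              # d: embedding dimension (illustrative)
W1, b1 = rng.normal(scale=0.1, size=(d, n)), np.zeros(d)           # encoder parameters
W1_hat, b1_hat = rng.normal(scale=0.1, size=(n, d)), np.zeros(n)   # decoder parameters

X = sigmoid(A @ W1.T + b1)                        # encoder: X_i = sigma(W1 A_i + b1)
A_hat = sigmoid(X @ W1_hat.T + b1_hat)            # decoder: A_hat_i = sigma(W1_hat X_i + b1_hat)

# Second-order loss: reconstruct only the non-zero entries, via the sign(A) mask.
L2 = np.sum((np.sign(A) * (A - A_hat)) ** 2)

L1 = 0.0     # the first-order (tuplewise) loss would come from the similarity network above
alpha = 1.0  # trade-off between the two losses (a tunable hyper-parameter)
L = L1 + alpha * L2
print(L2, L)
```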
2. Deep Hyper-Network Embedding
Optimization
• Optimization: SGD.
• Negative sampling: for each hyperedge, sample multiple negative tuples from a noise distribution and use them in backpropagation (sketched below).
• Out-of-sample extension: build the neighborhood (adjacency) vector of a newly arrived vertex and feed it into the corresponding autoencoder; this costs $O(d_v d)$.
• Complexity analysis: $O\!\left((nd + dl + l)\,bI\right)$, i.e., training complexity is linear in the number of nodes.
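A hypothetical sketch of the negative-sampling step: each observed hyperedge is corrupted by swapping one node for a random node of the same type, yielding non-edges for the "small similarity" side of Property 1. The node lists and the uniform noise distribution are illustrative simplifications.

```python
import random

# Observed hyperedges and the node pool for each of the T = 3 types (illustrative data).
hyperedges = [("u0", "m0", "t0"), ("u1", "m0", "t0"), ("u1", "m1", "t0")]
nodes_by_type = {0: ["u0", "u1"], 1: ["m0", "m1"], 2: ["t0", "t1"]}
edge_set = set(hyperedges)

def sample_negatives(edge, k=5):
    """Corrupt one position of a positive hyperedge k times to obtain negative tuples."""
    negatives = []
    while len(negatives) < k:
        pos = random.randrange(len(edge))                # which node type to corrupt
        replacement = random.choice(nodes_by_type[pos])  # noise distribution: uniform here
        corrupted = tuple(replacement if i == pos else v for i, v in enumerate(edge))
        if corrupted not in edge_set:                    # keep only genuine non-edges
            negatives.append(corrupted)
    return negatives

print(sample_negatives(hyperedges[0]))
```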
3. Experiment
Datasets
• A GPS network, a social network, a medicine network, and a semantic network
Methods
• Conventional pairwise network embedding methods (DeepWalk, LINE, node2vec)
→ applied after clique expansion of the hyperedges (see the sketch below)
• SHE: embedding for homogeneous hyper-networks; the tensor method is a direct way to preserve high-order relationships
• HEBE: embedding for heterogeneous event data
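A short sketch of the clique expansion used for the pairwise baselines: every hyperedge is replaced by a clique over its nodes. The hyperedge list is a toy one, extended with one extra hyperedge to show why the expansion is lossy.

```python
from itertools import combinations

hyperedges = [("u0", "m0", "t0"), ("u1", "m0", "t0"), ("u1", "m1", "t0"), ("u0", "m1", "t1")]

# Clique expansion: every hyperedge becomes a clique over its member nodes,
# so DeepWalk / LINE / node2vec can be run on the resulting pairwise graph.
pairwise_edges = {frozenset(pair)
                  for edge in hyperedges
                  for pair in combinations(edge, 2)}

print(sorted(tuple(sorted(e)) for e in pairwise_edges))
# Lossy: ("u0", "m1", "t0") is NOT a hyperedge, yet all three of its pairwise
# edges (u0-m1, u0-t0, m1-t0) appear in the expanded graph.
```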
Parameter Settings
• For network reconstruction and link prediction in hyper-networks, the baselines score a hyperedge by the average or the minimum of all pairwise similarities among its nodes (see the sketch below).
 DeepWalk and node2vec: window size 10, walk length 40, 10 walks per vertex
 LINE: 5 negative samples, starting learning rate $\rho_0 = 0.025$
• DHNE: a one-layer autoencoder plus a fully connected layer
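A brief sketch of how such a baseline tuple score can be formed from pairwise similarities; cosine similarity over random placeholder embeddings stands in for whatever the pairwise methods would actually learn.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
emb = {v: rng.normal(size=16) for v in ["u0", "m0", "t0"]}   # placeholder pairwise embeddings

def cosine(x, y):
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

def tuple_score(nodes, mode="mean"):
    """Score a candidate hyperedge as the mean (or minimum) of all pairwise similarities."""
    sims = [cosine(emb[a], emb[b]) for a, b in combinations(nodes, 2)]
    return float(np.mean(sims)) if mode == "mean" else float(min(sims))

print(tuple_score(("u0", "m0", "t0"), mode="mean"))
print(tuple_score(("u0", "m0", "t0"), mode="min"))
```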
Network Reconstruction
Link Prediction
Classification
Parameter Sensitivity
4. Contribution
Contribution
• Propose the DHNE model to learn low-dimensional representations of hyper-networks with indecomposable hyperedges.
• Previous network embedding methods mainly focus on networks with pairwise relationships; linear similarity metrics in the embedding space cannot preserve the indecomposability of hyperedges.
• DHNE realizes a nonlinear tuplewise similarity function while preserving both local and global proximity in the embedding space.
• DHNE outperforms the state-of-the-art algorithms on four different types of hyper-networks in the experiments.
Editor's Notes
1–3. (Slides 3–5) R is a binary indicator: 1 if a hyperedge exists, 0 otherwise ⇒ from it, the similarity S is computed, i.e., how similar the nodes are (first-order proximity).
4. (Network reconstruction) Comparing LINE, DeepWalk, SHE, and DHNE shows that converting indecomposable high-order relationships into multiple pairwise relationships can hurt the predictive power of the learned embeddings. Tensor and HEBE can handle the complexity of hyperedges to some extent, but DHNE's large improvement margin over these two methods clearly demonstrates the importance of second-order proximity in hyper-network embedding.
5. (Link prediction) The first task randomly hides 20% of the existing edges and trains the embedding model on the remaining network; the learned node embeddings and the N-tuplewise similarity function are then used to predict the held-out links. The second task evaluates performance at different sparsity levels of the network. Performance is measured with the AUC, which quantifies how well the model separates positive from negative examples. Link prediction results are reported in Table 4, and the GPS-dataset performance is shown as ROC curves in Figure 3 (left). Figure 3 (right) plots AUC against network sparsity, from 0.1 (dense) to 0.9 (sparse). DHNE consistently outperforms the state-of-the-art algorithms on all four types of hyper-networks, indicating that it is effective for embedding hyper-networks with indecomposable hyperedges.
6. (Classification) MovieLens is a widely used recommender-system dataset with movie ratings and metadata such as genre and release year; wordnet is a lexical database of English in which words are grouped into synonym sets (synsets) linked by semantic relations. These two datasets were chosen because they are the only ones with the label information needed for classification. Node embeddings from the different methods are used as input features to an SVM classifier. A fraction of the vertices is randomly sampled for training and the rest is used for testing; the training ratio varies from 10% to 90% for MovieLens and from 1% to 10% for wordnet, and unlabeled nodes are removed. Performance is evaluated with average Macro-F1 and Micro-F1; the results are shown in Figure 4.
7. (Parameter sensitivity) Sensitivity analysis of the ratio α between the first-order and second-order proximity losses and of the embedding dimension d.