PR-155: Exploring Randomly Wired Neural Networks for Image Recognition (Jinwon Lee)
This is the review of the 155th paper in the TensorFlow-KR paper-reading group PR12.
This time I reviewed Exploring Randomly Wired Neural Networks for Image Recognition, recently released (April 2) by Facebook AI Research. The paper surprised many people with the result that randomly generated networks match or exceed the performance of networks that people have painstakingly designed by hand; please see the slides and video for details.
Paper link: https://arxiv.org/abs/1904.01569
Video link: https://youtu.be/NrmLteQ5BC4
Characteristics of Networks Generated by Kernel Growing Neural Gas
This research aims to develop kernel GNG, a kernelized version of the growing neural gas (GNG) algorithm, and to investigate the features of the networks generated by the kernel GNG. The GNG is an unsupervised artificial neural network that can transform a dataset into an undirected graph, thereby extracting the features of the dataset as a graph. The GNG is widely used in vector quantization, clustering, and 3D graphics. Kernel methods are often used to map a dataset to a feature space, with support vector machines being the most prominent application. This paper introduces the kernel GNG approach and explores the characteristics of the networks generated by kernel GNG. Five kernels are used in this study: Gaussian, Laplacian, Cauchy, inverse multiquadric (IMQ), and log. The results show that the average degree and the average clustering coefficient decrease as the kernel parameter increases for the Gaussian, Laplacian, Cauchy, and IMQ kernels. So if fewer edges and a lower clustering coefficient (i.e. fewer triangles) are desired, a kernel GNG with a larger parameter value is more appropriate.
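The five kernels listed in the abstract can be written down concretely. The forms below are the standard textbook definitions, given here only as a sketch; the parameter names (sigma, c, d) are assumptions, not the paper's notation.

```python
import numpy as np

def gaussian(x, y, sigma=1.0):
    """k(x, y) = exp(-||x - y||^2 / (2 sigma^2))"""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def laplacian(x, y, sigma=1.0):
    """k(x, y) = exp(-||x - y|| / sigma)"""
    return np.exp(-np.linalg.norm(x - y) / sigma)

def cauchy(x, y, sigma=1.0):
    """k(x, y) = 1 / (1 + ||x - y||^2 / sigma^2)"""
    return 1.0 / (1.0 + np.sum((x - y) ** 2) / sigma ** 2)

def inverse_multiquadric(x, y, c=1.0):
    """k(x, y) = 1 / sqrt(||x - y||^2 + c^2)"""
    return 1.0 / np.sqrt(np.sum((x - y) ** 2) + c ** 2)

def log_kernel(x, y, d=2.0):
    """k(x, y) = -log(||x - y||^d + 1)"""
    return -np.log(np.linalg.norm(x - y) ** d + 1.0)
```

Each kernel measures similarity between a sample and a reference vector; the kernel parameter controls how quickly that similarity decays with distance, which is what drives the degree and clustering-coefficient trends reported above.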
Bio-inspired Algorithms for Evolving the Architecture of Convolutional Neural Networks (Ashray Bhandare)
In this thesis, three bio-inspired algorithms, viz. the genetic algorithm, particle swarm optimizer (PSO), and grey wolf optimizer (GWO), are used to optimally determine the architecture of a convolutional neural network (CNN) that is used to classify handwritten numbers. The CNN is a class of deep feed-forward networks that has seen major success in the field of visual image analysis. During training, a good CNN architecture is capable of extracting complex features from the given training data; however, at present, there is no standard way to determine the architecture of a CNN, and domain knowledge and human expertise are required to design one. Typically, architectures are created by experimenting with and modifying a few existing networks.
The bio-inspired algorithms determine the exact architecture of a CNN by evolving the various hyperparameters of the architecture for a given application. The proposed method was tested on the MNIST dataset, a large database of handwritten digits that is commonly used in many machine-learning models. The experiments were carried out on an Amazon Web Services (AWS) GPU instance, which helped to speed up the experiment time. The performance of all three algorithms was comparatively studied. The results show that the bio-inspired algorithms are capable of generating successful CNN architectures, performing the entire process of architecture generation without any human intervention.
NS-CUK Seminar: S.T.Nguyen, Review on "Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting", IEEE 2020
1. LAB SEMINAR
Nguyen Thanh Sang
Network Science Lab
Dept. of Artificial Intelligence
The Catholic University of Korea
E-mail: sang.ngt99@gmail.com
Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting
--- Giorgos Bouritsas, Fabrizio Frasca, Stefanos Zafeiriou, and Michael M. Bronstein ---
2023-06-01
3. Introduction
Graph Neural Networks (GNNs) have achieved remarkable results in a variety of applications.
GNNs use an aggregation function to update the vector representation of each node by transforming and aggregating the vector representations of its neighbours.
4. Graph Isomorphism
+ Two graphs are called isomorphic whenever there exists an isomorphism between them.
+ In graph theory, an isomorphism of graphs 𝐺 and 𝐻 is:
• A bijection between the vertex sets of 𝐺 and 𝐻: 𝑓: 𝑉(𝐺) → 𝑉(𝐻)
• such that any two vertices 𝑢 and 𝑣 of 𝐺 are adjacent in 𝐺 if and only if 𝑓(𝑢) and 𝑓(𝑣) are adjacent in 𝐻
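The definition above can be checked directly by brute force on small graphs: search over all bijections for one that preserves adjacency. A minimal sketch (the function name and edge-list representation are illustrative, not from the paper; the search is factorial in the number of vertices, so this is only viable for tiny graphs):

```python
from itertools import permutations

def are_isomorphic(edges_g, edges_h, n):
    """Brute-force isomorphism test for undirected graphs on the vertex
    set {0, ..., n-1}: look for a bijection f such that (u, v) is an
    edge of G iff (f(u), f(v)) is an edge of H."""
    eg = {frozenset(e) for e in edges_g}
    eh = {frozenset(e) for e in edges_h}
    if len(eg) != len(eh):          # different edge counts: cannot match
        return False
    for f in permutations(range(n)):
        if all(frozenset((f[u], f[v])) in eh for u, v in eg):
            return True
    return False
```

For example, two differently labelled 4-vertex paths are isomorphic, while a 4-path and a 4-cycle are not.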
5. Graph Automorphism
+ A bijection mapping a graph onto itself
• When 𝐺 and 𝐻 are one and the same graph
• A form of symmetry
+ Problems
• Testing whether a graph has a nontrivial automorphism => computational complexity
• Constructing the automorphism group => orbits (the equivalence classes of vertices under the group's action)
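The automorphism group and the resulting vertex orbits can likewise be enumerated by brute force on small graphs. This is an illustrative sketch only (exponential in the number of vertices); names and representations are assumptions:

```python
from itertools import permutations

def automorphisms(edges, n):
    """All bijections of {0, ..., n-1} onto itself that preserve adjacency."""
    e = {frozenset(x) for x in edges}
    return [p for p in permutations(range(n))
            if {frozenset((p[u], p[v])) for u, v in edges} == e]

def orbits(edges, n):
    """Vertex orbits: v and w share an orbit iff some automorphism maps v to w."""
    autos = automorphisms(edges, n)
    seen, result = set(), []
    for v in range(n):
        if v not in seen:
            orb = {p[v] for p in autos}   # images of v under the group
            seen |= orb
            result.append(sorted(orb))
    return result
```

On the 3-vertex path, the two endpoints form one orbit and the middle vertex its own; on a 4-cycle, all vertices share a single orbit.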
6. Problems
❖ The Weisfeiler-Lehman (WL) test: the representative test for isomorphism
• Low computational complexity
• Works well for most graphs, but has limits in some cases
• Does not apply to all real-world data; node labels are arbitrarily initialized for the test
[Figure: WL color refinement, showing the initial coloring and the 1st, 2nd, and 3rd refinement iterations]
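The refinement iteration shown in the figure can be sketched as follows (names are illustrative). To compare two graphs, run the refinement on their disjoint union and compare each graph's multiset of final colors: different multisets certify non-isomorphism, while identical multisets are inconclusive, which is exactly the "limit in some cases" noted above.

```python
def wl_colors(adj, rounds=3):
    """1-dimensional Weisfeiler-Leman (color refinement) on an adjacency
    list {node: [neighbours]}. Every node starts with the same color;
    each round, a node's new color encodes its own color together with
    the multiset of its neighbours' colors."""
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        signatures = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                      for v in adj}
        # canonical relabeling: identical signatures get identical new colors
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return colors
```

The classic failure case: on the disjoint union of a 6-cycle and a triangle, every vertex is 2-regular with identically colored neighbours, so 1-WL never separates them even though the components are clearly different.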
7. Problems
• Since message-passing GNNs are at most as powerful as the Weisfeiler-Leman (WL) test, they are limited in their ability to adequately exploit the graph structure, e.g. by counting substructures, which are important in the study of complex networks.
• How to go beyond isotropic, i.e. locally symmetric, aggregation functions?
• How to encode structural characteristics of the graph?
• How to achieve the above two without sacrificing invariance to isomorphism?
8. Contributions
• Break local symmetries by introducing structural information into the aggregation function.
• Each neighbour (message) is transformed differently depending on its structural relationship with the central node, obtained by counting the appearance of certain substructures.
• The Graph Substructure Network (GSN) is strictly more expressive than traditional GNNs for the vast majority of substructures, while retaining the locality of message passing, as opposed to higher-order methods.
• When the structural inductive biases are chosen based on domain-specific knowledge, GSN achieves state-of-the-art results.
9. Structural Features
+ Features encode structural roles by counting the appearance of certain substructures.
+ Step 1: Fix a set of small (connected) graphs $\mathcal{H} = \{H_1, H_2, \dots, H_K\}$, e.g. cycles, paths, cliques, or trees.
- For each graph $H \in \mathcal{H}$, find its isomorphic subgraphs in $G$, denoted $G_S$.
- For each node $v \in V_{G_S}$, infer its role w.r.t. $H$ by obtaining the orbit of its mapping $f(v)$ in $H$, $\mathrm{Orb}_H(f(v))$.
+ Step 2: Obtain the vertex structural feature $\mathbf{x}_H^V(v)$ of $v$ by counting all the possible appearances of the different orbits at $v$:
- For all $i \in \{1, 2, \dots, d_H\}$: $\mathbf{x}_{H,i}^V(v) = \big|\{G_S \simeq H : v \in V_{G_S},\ f(v) \in O_{H,i}^V\}\big|$,
i.e. the number of isomorphic subgraphs containing $v$ in which $v$ is mapped into the $i$-th vertex orbit; $f$ is a function mapping a subgraph $G_S$ to $H$, and it determines the orbit of each node $v$.
- Feature vector: $\mathbf{x}_v^V = \big[\mathbf{x}_{H_1}^V(v), \mathbf{x}_{H_2}^V(v), \dots, \mathbf{x}_{H_K}^V(v)\big]$
- Analogously, the edge structural feature $\mathbf{x}_H^E(u, v)$ of $(u, v)$: $\mathbf{x}_{H,i}^E(u, v) = \big|\{G_S \simeq H : (u, v) \in \mathcal{E}_{G_S},\ (f(u), f(v)) \in O_{H,i}^E\}\big|$
- $\mathbf{x}_{u,v}^E = \big[\mathbf{x}_{H_1}^E(u, v), \mathbf{x}_{H_2}^E(u, v), \dots, \mathbf{x}_{H_K}^E(u, v)\big]$
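For a concrete instance of Step 2: when $H$ is the triangle, all three vertices of $H$ lie in a single orbit, so the vertex structural feature reduces to the number of triangles containing each node. A minimal sketch of that special case (illustrative only; the authors' method handles arbitrary substructures and orbits):

```python
from itertools import combinations

def triangle_counts(adj):
    """Vertex structural feature x_H^V(v) for H = triangle (one orbit):
    the number of subgraphs of G isomorphic to H that contain v.
    adj is an adjacency list {node: [neighbours]} of an undirected graph."""
    nbrs = {v: set(adj[v]) for v in adj}
    counts = {v: 0 for v in adj}
    for v in adj:
        # each unordered neighbour pair (u, w) that is itself an edge
        # closes exactly one triangle v-u-w containing v
        for u, w in combinations(adj[v], 2):
            if w in nbrs[u]:
                counts[v] += 1
    return counts
```

In the complete graph K4 every vertex lies in three triangles, while on a path no vertex lies in any; those counts are exactly the per-node identifiers GSN would attach for the triangle substructure.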
10. Structure-aware Message Passing
The substructure layer as a Message Passing Neural Network: [Message Info.] + [Structural Roles Info.]
$\mathbf{h}_v^{t+1} = \mathrm{UP}^{t+1}\big(\mathbf{h}_v^t, \mathbf{m}_v^{t+1}\big)$, where $\mathbf{m}_v^{t+1} = M^{t+1}\big(\{\!\!\{(\mathbf{h}_v^t, \mathbf{h}_u^t, \mathbf{x}_v^V, \mathbf{x}_u^V, \mathbf{e}_{u,v}) : u \in \mathcal{N}(v)\}\!\!\}\big)$
$\mathrm{UP}^{t+1}$: an arbitrary function approximator (e.g., an MLP)
$M^{t+1}$: the neighbourhood aggregation function, an arbitrary function on multisets
$\mathbf{e}_{u,v}$: the edge features
$\mathbf{x}_v^V, \mathbf{x}_u^V$: the vertex structural identifiers; $\mathbf{x}_{u,v}^E$: the edge structural identifiers
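A simplified numerical sketch of one such structure-aware step, with plain linear maps standing in for the MLPs and only the vertex identifiers used (GSN-v style). All names, shapes, and the ReLU update are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def gsn_layer(h, edges, x_struct, W_msg, W_up):
    """One structure-aware message-passing step (sketch).
    h: (n, d) node states; x_struct: (n, ds) vertex structural identifiers;
    edges: undirected edge list. Each message concatenates the sender's
    state with its structural identifier before a linear map; the update
    sums incoming messages and combines them with the node's own state."""
    n = h.shape[0]
    m = np.zeros((n, W_msg.shape[1]))
    for u, v in edges:                    # undirected: send both ways
        for s, t in ((u, v), (v, u)):
            m[t] += np.concatenate([h[s], x_struct[s]]) @ W_msg
    return np.maximum(0.0, np.concatenate([h, m], axis=1) @ W_up)  # ReLU
```

Because the structural identifiers enter the message function, two neighbours with identical states but different substructure counts are transformed differently, which is the symmetry-breaking mechanism described above.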
11. Power of GSNs
+ GSN > MPNN: GSN is itself an MPNN-based architecture, so it is at least as powerful
+ GSN > 1-WL: strictly more powerful when all possible orbits are considered
+ Open problem: how the fixed set of subgraphs should be chosen has not been settled
[Figure: Rook's 4x4 graph (contains a 4-clique) and the Shrikhande graph (largest clique: the triangle), a strongly regular pair on which the 2-FWL test fails]
12. Experiments
Settings
+ Baseline: MPNN with MLPs
+ Substructure families: cycles, paths, trees, and cliques
+ Substructure size: up to k
+ Datasets: synthetic, TUD, ZINC, and OGB-MOLHIV
13. Synthetic Graph Isomorphism Test
+ Dataset: a collection of strongly regular graphs of size up to 35 nodes
+ Isomorphism decision: two graphs are declared isomorphic if the Euclidean distance between their representations is smaller than a predefined threshold 𝜖
+ The number of failure cases of GSN decreases rapidly as k increases (cycles and paths of maximum length k = 6)
14. TUD Graph Classification
• Datasets: bioinformatics and social networks
• Comparison: against GNNs and graph kernels, with 10-fold cross-validation
• Base architecture: GIN
• The best-performing substructures are reported for both GSN-e and GSN-v
=> The proposed model obtains SOTA performance on most of the datasets, with a considerable margin over the main GNN baselines in some cases.
15. ZINC Molecular Graphs
• Dataset: commercially-available compounds for virtual screening (John J. Irwin et al.), 10k / 2k molecules
• Task: graph regression, evaluated by MAE, using k-cycle counting for the structural features
=> GSN achieves state-of-the-art results, outperforming all the baseline architectures.
16. OGB-MOLHIV
• GSN seamlessly improves the performance of the base architecture.
• Cyclical substructures are a good inductive bias when learning on molecules, confirming the results on the ZINC dataset, while the same holds for triangles in PPA networks. Tasks defined on graphs with community structure correlate with the presence of triangles (or cliques), as was the case for the social networks in the TUD experiments.
• General-purpose GNNs benefit from symmetry-breaking mechanisms, either in the form of eigenvectors (DGN) or in the form of substructures.
17. Ablation Studies
• The test error is not guaranteed to decrease as the identifiers become more discriminative: maximally discriminative (unique) identifiers fail to improve the baseline architecture's test-set performance.
• Unique identifiers can be hard to generalise when chosen in a non-permutation-equivariant way, which motivates once more choosing identifiers not only for their discriminative power, but also in a way that allows incorporating the appropriate inductive biases.
• GSN manages to generalise much better even with a small fraction of the training dataset.
18. Conclusions
• A novel way to design structure-aware graph neural networks, motivated by the limitations of traditional GNNs in capturing important topological properties of the graph.
• A message-passing scheme enhanced with structural features extracted by subgraph isomorphism counting.
• For some types of substructures, such as paths and cycles, the counting can be done with significantly lower complexity.
• The computationally expensive step is done only once as preprocessing, and thus does not affect network training and inference, which remain linear, the same way as in message-passing neural networks. The memory complexity of training and inference is linear as well.
• Most importantly, the expressive power of GSN is different from that of the k-WL tests, and in some cases it is stronger.