Paper link: HGNN+: General Hypergraph Neural Networks | IEEE Journals & Magazine | IEEE Xplore
The English here is typed entirely by hand, as a summarizing and paraphrasing of the original paper. Unavoidable spelling and grammar mistakes may appear; if you spot any, feel free to point them out in the comments! This article reads more like personal notes, so take it with a grain of salt.
目录
1. 心得
2. 论文逐段精读
2.1. Abstract
2.2. Introduction
2.3. Related Work
2.3.1. Graph Neural Networks
2.3.2. Hypergraph Learning
2.4. Preliminaries of Hypergraphs
2.5. The Framework of Hypergraph Neural Network HGNN+
2.5.1. Hypergraph Modeling
2.5.2. Hypergraph Convolution
2.6. Discussions
2.6.1. Hypergraph vs. Graph
2.6.2. HGNN/HGNN+ vs. GNN
2.6.3. HGNN vs. HGNN+
2.7. Experiments and Discussions
2.7.1. Experimental Settings
2.7.2. Vertex Classification on the Data With Graph Structure
2.7.3. Vertex Classification on the Data Without Graph Structure
2.7.4. Vertex Classification on the Data With Hypergraph Structure
2.7.5. Visualization
2.8. THU-DeepHypergraph: An Open Toolbox of the HGNN+ Framework
2.9. Conclusion
3. Reference
1. 心得
(1) Classics are still worth reading.
2. 论文逐段精读
2.1. Abstract
①Limitation of GNNs: they struggle with feature extraction on multi-modal/multi-type data
②Limitation of existing HGNN: the same weight is shared across different modalities/types of hyperedges
③So, the authors propose HGNN+ together with the THU-DeepHypergraph toolbox
2.2. Introduction
①High-order relationships:
②Framework of HGNN+:
③The conception of graph and hypergraph:
④Hyperedge groups: pairwise edges, attributes, k-hop neighbors, and feature-space neighbors
⑤Advantages of HGNN+: it not only performs adaptive aggregation of hyperedge groups, but also extends to directed hypergraphs
2.3. Related Work
2.3.1. Graph Neural Networks
①Lists spectral-based and spatial-based GNNs
2.3.2. Hypergraph Learning
①Lists the development of hypergraph learning and hypergraph neural networks
2.4. Preliminaries of Hypergraphs
①Notations: for hypergraph $\mathcal{G}=(\mathcal{V},\mathcal{E},\mathbf{W})$, the incidence matrix $\mathbf{H}\in\{0,1\}^{|\mathcal{V}|\times|\mathcal{E}|}$ can be written as:

$$\mathbf{H}(v,e)=\begin{cases}1, & \text{if } v\in e\\ 0, & \text{if } v\notin e\end{cases}$$

and the initial node feature is $\mathbf{X}\in\mathbb{R}^{|\mathcal{V}|\times C}$, where $C$ denotes the dimension of the feature
②Optimization goal in the node classification task:

$$\arg\min_{f}\ \mathcal{R}_{emp}(f)+\Omega(f)$$

where $\Omega(f)$ is a regularizer on the hypergraph, $\mathcal{R}_{emp}(f)$ denotes the supervised empirical loss, and $f(\cdot)$ is the classification function
③The regularizer can be represented by:

$$\Omega(f)=\frac{1}{2}\sum_{e\in\mathcal{E}}\sum_{u,v\in\mathcal{V}}\frac{\mathbf{W}(e)\,\mathbf{H}(u,e)\,\mathbf{H}(v,e)}{\delta(e)}\left(\frac{f(u)}{\sqrt{d(u)}}-\frac{f(v)}{\sqrt{d(v)}}\right)^{2}$$
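To ground these definitions, here is a minimal NumPy sketch (a toy hypergraph of my own; variable names are illustrative, not from the paper's code) that builds $\mathbf{H}$, the degree vectors, and checks the regularizer via $\Omega(f)=f^{\top}\mathbf{\Delta}f$ with the normalized hypergraph Laplacian $\mathbf{\Delta}=\mathbf{I}-\mathbf{D}_{v}^{-1/2}\mathbf{H}\mathbf{W}\mathbf{D}_{e}^{-1}\mathbf{H}^{\top}\mathbf{D}_{v}^{-1/2}$:

```python
import numpy as np

# Toy hypergraph (notation from Sec. 2.4): 5 vertices, 3 hyperedges
hyperedges = [{0, 1, 2}, {1, 3}, {2, 3, 4}]
n_v, n_e = 5, len(hyperedges)

# Incidence matrix H(v, e) = 1 if v belongs to e, else 0
H = np.zeros((n_v, n_e))
for e, verts in enumerate(hyperedges):
    for v in verts:
        H[v, e] = 1.0

w = np.ones(n_e)                # hyperedge weights (diagonal of W)
d_v = H @ w                     # vertex degree d(v) = sum_e w(e) H(v, e)
d_e = H.sum(axis=0)             # hyperedge degree delta(e)

# Normalized Laplacian: Delta = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
Dv_inv_sqrt = np.diag(d_v ** -0.5)
De_inv = np.diag(1.0 / d_e)
P = Dv_inv_sqrt @ H @ np.diag(w) @ De_inv @ H.T @ Dv_inv_sqrt
Delta = np.eye(n_v) - P

# The regularizer in (3) equals f^T Delta f for a vertex signal f
f = np.random.randn(n_v)
print(f @ Delta @ f)
```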
2.5. The Framework of Hypergraph Neural Network HGNN+
2.5.1. Hypergraph Modeling
(1)Hyperedge Group Generation
①When the data correlation comes with a graph structure (a sketch of all four generation strategies follows this list):
(a top) pairwise edge | Similar to a traditional graph, every edge forms a 2-vertex hyperedge: $\mathcal{E}_{pair}=\{\{u,v\}\mid(u,v)\in\mathcal{E}\}$ |
(a bottom) k-hop neighbor | One hyperedge per vertex, containing its k-hop neighborhood: $\mathcal{E}_{hop_k}=\{N_{hop_k}(v)\mid v\in\mathcal{V}\}$, where $N_{hop_k}(v)=\{u\mid \mathbf{A}^{k}(u,v)\neq 0,\ u\in\mathcal{V}\}$ |
②When the data correlation comes without a graph structure:
(b top) attribute | Hyperedges are constructed from vertices sharing the same attribute (geo-location, time, and other specific information) (e.g., if every vertex is a paper, all survey papers share one hyperedge, all research papers share another, all workshop papers share yet another, and so on): $\mathcal{E}_{attr}=\{\{v\mid a\in\mathcal{A}(v)\}\mid a\in\mathcal{A}\}$ |
(b bottom) feature | Each vertex and its k nearest neighbors in feature space form a hyperedge: $\mathcal{E}_{knn}=\{\{v\}\cup N_{knn}(v)\mid v\in\mathcal{V}\}$ |
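A compact Python sketch of the four strategies above (helper names are mine, not the THU-DeepHypergraph API):

```python
import numpy as np

def pairwise_groups(edges):
    """(a top): every graph edge {u, v} becomes a 2-vertex hyperedge."""
    return [frozenset(e) for e in edges]

def khop_groups(adj, k):
    """(a bottom): one hyperedge per vertex with its k-hop neighborhood
    (adding I keeps the center vertex inside its own hyperedge)."""
    reach = np.linalg.matrix_power(adj + np.eye(len(adj)), k)
    return [frozenset(np.nonzero(reach[v])[0]) for v in range(len(adj))]

def attribute_groups(attrs):
    """(b top): vertices sharing an attribute value share a hyperedge."""
    groups = {}
    for v, a in enumerate(attrs):
        groups.setdefault(a, set()).add(v)
    return [frozenset(g) for g in groups.values()]

def knn_groups(X, k):
    """(b bottom): each vertex plus its k nearest feature-space neighbors."""
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    nbrs = np.argsort(dist, axis=1)[:, :k + 1]  # includes the vertex itself
    return [frozenset(row) for row in nbrs]
```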
(2)Combination of Hyperedge Groups
①Coequal fusion of each hyperedge group: concatenate the incidence matrices of all $K$ hyperedge groups along the hyperedge dimension:

$$\mathbf{H}=\mathbf{H}_{1}\,\|\,\mathbf{H}_{2}\,\|\cdots\|\,\mathbf{H}_{K}$$

②Adaptive fusion:

$$\begin{cases}\mathbf{W}=\operatorname{diag}\!\left(w_{1}\mathbf{1}(|\mathcal{E}_{1}|)\,\|\cdots\|\,w_{K}\mathbf{1}(|\mathcal{E}_{K}|)\right)\\ \mathbf{H}=\mathbf{H}_{1}\,\|\,\mathbf{H}_{2}\,\|\cdots\|\,\mathbf{H}_{K}\end{cases}$$

where $w_{k}$ denotes a trainable parameter shared by all hyperedges inside a specified hyperedge group $\mathcal{E}_{k}$, $\mathbf{1}(|\mathcal{E}_{k}|)$ returns a vector of size $|\mathcal{E}_{k}|$ with all values of 1, and $\mathbf{W}$ denotes a diagonal weight matrix
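A PyTorch sketch of the two fusion schemes with toy shapes (the parameterization of $\mathbf{W}$ is my reading of the adaptive-fusion equation above, not the official implementation):

```python
import torch

# Coequal fusion: concatenate the groups' incidence matrices along the
# hyperedge dimension and give every hyperedge the same weight.
H1 = torch.rand(5, 3).round()       # toy incidence matrices, one per group
H2 = torch.rand(5, 4).round()
H = torch.cat([H1, H2], dim=1)      # |V| x (|E_1| + |E_2|)
W_coequal = torch.ones(H.shape[1])  # diagonal of W

# Adaptive fusion: one trainable scalar w_k per hyperedge group,
# broadcast to all hyperedges inside that group.
w = torch.nn.Parameter(torch.ones(2))  # one weight per group
W_adaptive = torch.cat([
    w[0].expand(H1.shape[1]),          # w_1 * 1(|E_1|)
    w[1].expand(H2.shape[1]),          # w_2 * 1(|E_2|)
])                                     # diagonal of the learned W
```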
2.5.2. Hypergraph Convolution
(1)Spectral Convolution on Hypergraph
①The spectral convolution of signal $\mathbf{x}$ and filter $\mathbf{g}$ can be denoted as:

$$\mathbf{g}\star\mathbf{x}=\mathbf{\Phi}\left((\mathbf{\Phi}^{\top}\mathbf{g})\odot(\mathbf{\Phi}^{\top}\mathbf{x})\right)=\mathbf{\Phi}\,g(\mathbf{\Lambda})\,\mathbf{\Phi}^{\top}\mathbf{x}$$

where $\odot$ denotes the element-wise Hadamard product, and $\mathbf{\Phi}$, $\mathbf{\Lambda}$ come from the eigen-decomposition of the hypergraph Laplacian $\mathbf{\Delta}=\mathbf{\Phi}\mathbf{\Lambda}\mathbf{\Phi}^{\top}$
②Further simplify it by Chebyshev polynomials:

$$\mathbf{g}\star\mathbf{x}\approx\sum_{k=0}^{K}\theta_{k}T_{k}(\tilde{\mathbf{\Delta}})\mathbf{x},\qquad \tilde{\mathbf{\Delta}}=\frac{2}{\lambda_{max}}\mathbf{\Delta}-\mathbf{I}$$

③Limit the Chebyshev expansion to $K=1$ (with $\lambda_{max}\approx 2$):

$$\mathbf{g}\star\mathbf{x}\approx\theta_{0}\mathbf{x}-\theta_{1}\mathbf{D}_{v}^{-1/2}\mathbf{H}\mathbf{W}\mathbf{D}_{e}^{-1}\mathbf{H}^{\top}\mathbf{D}_{v}^{-1/2}\mathbf{x}$$

④Reduce to a single parameter $\theta$ to avoid overfitting and simplify the graph convolution:

$$\mathbf{g}\star\mathbf{x}\approx\theta\,\mathbf{D}_{v}^{-1/2}\mathbf{H}\mathbf{W}\mathbf{D}_{e}^{-1}\mathbf{H}^{\top}\mathbf{D}_{v}^{-1/2}\mathbf{x}$$
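Applied feature-wise with a learnable transform $\mathbf{\Theta}$, the single-parameter filter in ④ is only a few NumPy lines (a sketch; `hgnn_conv` is a name I made up):

```python
import numpy as np

def hgnn_conv(X, H, w, Theta):
    """One simplified spectral hypergraph convolution (step 4 above):
    Y = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta.  Illustrative only."""
    d_v = H @ w                      # vertex degrees
    d_e = H.sum(axis=0)              # hyperedge degrees
    Dv_inv_sqrt = np.diag(d_v ** -0.5)
    De_inv = np.diag(1.0 / d_e)
    return Dv_inv_sqrt @ H @ np.diag(w) @ De_inv @ H.T @ Dv_inv_sqrt @ X @ Theta
```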
(2)General Spatial Convolution on Hypergraph
①Inter-neighbor relation:

$$\mathcal{N}=\{(v,e)\mid v\in e,\ v\in\mathcal{V},\ e\in\mathcal{E}\}$$

and the vertex inter-neighbor set of hyperedge $e$ is defined as $\mathcal{N}_{v}(e)=\{v\mid(v,e)\in\mathcal{N}\}$; the hyperedge inter-neighbor set of vertex $v$ is defined as $\mathcal{N}_{e}(v)=\{e\mid(v,e)\in\mathcal{N}\}$
②Spatial hypergraph convolution (vertex → hyperedge → vertex):

$$\begin{cases}m_{e}^{(l)}=\sum_{u\in\mathcal{N}_{v}(e)}M_{v}^{(l)}\left(x_{u}^{(l)}\right)\\ y_{e}^{(l)}=U_{e}^{(l)}\left(w_{e},m_{e}^{(l)}\right)\\ m_{v}^{(l)}=\sum_{e\in\mathcal{N}_{e}(v)}M_{e}^{(l)}\left(x_{v}^{(l)},y_{e}^{(l)}\right)\\ x_{v}^{(l+1)}=U_{v}^{(l)}\left(x_{v}^{(l)},m_{v}^{(l)}\right)\end{cases}$$

where $m_{e}^{(l)}$ is the message of hyperedge $e$, $y_{e}^{(l)}$ is the hyperedge feature of hyperedge $e$, which is an element of the hyperedge feature set $Y^{(l)}$ in layer $l$; $M_{v}^{(l)},U_{e}^{(l)},M_{e}^{(l)},U_{v}^{(l)}$ are the vertex message function, hyperedge update function, hyperedge message function, and vertex update function in the $l$-th layer
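A literal, loop-based sketch of this four-step scheme (the callables mirror the symbols $M_v, U_e, M_e, U_v$; nothing here is a real API):

```python
def spatial_hyperconv(X, hyperedges, w, M_v, U_e, M_e, U_v):
    """One layer of generic spatial hypergraph convolution.
    X: list of vertex feature vectors; hyperedges: list of vertex sets."""
    # Stage 1: vertex -> hyperedge
    Y = {}
    for e, verts in enumerate(hyperedges):           # N_v(e): vertices in e
        m_e = sum(M_v(X[u]) for u in verts)          # hyperedge message
        Y[e] = U_e(w[e], m_e)                        # hyperedge feature y_e
    # Stage 2: hyperedge -> vertex
    X_next = []
    for v in range(len(X)):
        inc = [e for e, verts in enumerate(hyperedges) if v in verts]  # N_e(v)
        m_v = sum(M_e(X[v], Y[e]) for e in inc)      # vertex message
        X_next.append(U_v(X[v], m_v))                # updated vertex feature
    return X_next
```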
(3)HGNN+ Convolution Layer Configurations
①Functions chosen in HGNN+ (instantiations of $M_v, U_e, M_e, U_v$):
②Hypergraph convolution in matrix format:

$$\mathbf{X}^{(l+1)}=\sigma\!\left(\mathbf{D}_{v}^{-1}\mathbf{H}\mathbf{W}\mathbf{D}_{e}^{-1}\mathbf{H}^{\top}\mathbf{X}^{(l)}\mathbf{\Theta}^{(l)}\right)$$
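A minimal PyTorch sketch of this matrix-form layer (my own module, not the official toolbox layer):

```python
import torch
import torch.nn as nn

class HGNNPlusConv(nn.Module):
    """Sketch of X' = sigma(Dv^{-1} H W De^{-1} H^T X Theta)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, X, H, w):
        d_v = H @ w                     # vertex degrees
        d_e = H.sum(dim=0)              # hyperedge degrees
        msg = torch.diag(1.0 / d_e) @ H.t() @ self.theta(X)     # vertex -> edge
        out = torch.diag(1.0 / d_v) @ H @ torch.diag(w) @ msg   # edge -> vertex
        return torch.relu(out)
```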
2.6. Discussions
2.6.1. Hypergraph vs. Graph
①"from the random walks’ aspect, a hypergraph with edge-independent vertex weights is equivalent to a weighted graph, and a hypergraph with edge-dependent vertex weights cannot be reduced to a weighted graph."
②For an undirected hypergraph whose vertex weights have no relation to edges, $\gamma_{e}(v)=\gamma(v)$ for every hyperedge $e$ incident to $v$ (edge-independent)
③For those edge-dependent hypergraphs, $\gamma_{e}(v)$ varies with the hyperedge $e$. The degree of vertex $v$ and hyperedge $e$ can be represented by:

$$d(v)=\sum_{e\in\mathcal{E}}\mathbf{W}(e)\mathbf{H}(v,e),\qquad \delta(e)=\sum_{v\in\mathcal{V}}\gamma_{e}(v)\mathbf{H}(v,e)$$

④The paper then proves statement ① with random walks and Markov chains; I will not transcribe the proof here (I am no math expert)
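A toy NumPy check of the degree definitions in ③, with an explicit edge-dependent weight array $\gamma_e(v)$ (all values here are made up):

```python
import numpy as np

H = np.array([[1, 0], [1, 1], [0, 1]], dtype=float)  # 3 vertices, 2 hyperedges
w = np.array([0.5, 2.0])                             # hyperedge weights w(e)
gamma = np.array([[0.3, 0.0],                        # gamma_e(v), one column per e
                  [0.7, 0.4],
                  [0.0, 0.6]])

d_v = H @ w                        # d(v)    = sum_e w(e) H(v, e)
delta_e = (gamma * H).sum(axis=0)  # delta(e) = sum_v gamma_e(v) H(v, e)
print(d_v, delta_e)
```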
2.6.2. HGNN/HGNN+ vs. GNN
①For a simple hypergraph (a traditional graph, where every hyperedge connects exactly two vertices and $\mathbf{W}=\mathbf{I}$), the HGNN convolution can be reduced to:

$$\mathbf{X}^{(l+1)}=\sigma\!\left(\tfrac{1}{2}\left(\mathbf{I}+\mathbf{D}_{v}^{-1/2}\mathbf{A}\mathbf{D}_{v}^{-1/2}\right)\mathbf{X}^{(l)}\mathbf{\Theta}^{(l)}\right)$$

(see the short derivation below)
②Utilizes rooted subtrees to analogize message passing on a 2-uniform hypergraph:
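To make the reduction in ① concrete, a short derivation (my restatement, using the standard incidence identity): for a 2-uniform hypergraph with $\mathbf{W}=\mathbf{I}$, every hyperedge contains exactly two vertices, so

$$\mathbf{H}\mathbf{H}^{\top}=\mathbf{A}+\mathbf{D}_{v},\qquad \mathbf{D}_{e}=2\mathbf{I}$$

and hence

$$\mathbf{D}_{v}^{-1/2}\mathbf{H}\mathbf{W}\mathbf{D}_{e}^{-1}\mathbf{H}^{\top}\mathbf{D}_{v}^{-1/2}=\tfrac{1}{2}\,\mathbf{D}_{v}^{-1/2}(\mathbf{A}+\mathbf{D}_{v})\mathbf{D}_{v}^{-1/2}=\tfrac{1}{2}\left(\mathbf{I}+\mathbf{D}_{v}^{-1/2}\mathbf{A}\mathbf{D}_{v}^{-1/2}\right)$$

i.e., GCN-style propagation up to the factor $\tfrac{1}{2}$.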
2.6.3. HGNN vs. HGNN+
①Difference between the two convolutions:

$$\mathbf{X}^{(l+1)}=\sigma\!\left(\mathbf{D}_{v}^{-1/2}\mathbf{H}\mathbf{W}\mathbf{D}_{e}^{-1}\mathbf{H}^{\top}\mathbf{D}_{v}^{-1/2}\mathbf{X}^{(l)}\mathbf{\Theta}^{(l)}\right)\quad\text{(HGNN)}$$

$$\mathbf{X}^{(l+1)}=\sigma\!\left(\mathbf{D}_{v}^{-1}\mathbf{H}\mathbf{W}\mathbf{D}_{e}^{-1}\mathbf{H}^{\top}\mathbf{X}^{(l)}\mathbf{\Theta}^{(l)}\right)\quad\text{(HGNN+)}$$

The former is symmetric (undirected) message passing, while the latter is asymmetric (undirected) message passing
②Definition of a directed hypergraph: each hyperedge is a pair $e=\langle e_{s},e_{t}\rangle$, where $e_{t}$ and $e_{s}$ are the sets of target and source vertices for hyperedge $e$
③Specific message passing of the directed hypergraph follows the same four-step scheme as in Sec. 2.5.2, except that the hyperedge message is aggregated only from the source vertices $e_{s}$ and the vertex update is applied only to the target vertices $e_{t}$
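Assuming the source/target split in ②, a hedged sketch of directed propagation with two incidence matrices (my adaptation of the four-step scheme, not the paper's exact parameterization):

```python
import torch

def directed_hyperconv(X, Hs, Ht, w):
    """Hs, Ht: |V| x |E| incidence matrices of source / target vertex sets.
    Messages are gathered from e_s and delivered to e_t only."""
    d_es = Hs.sum(dim=0).clamp(min=1)        # source-set sizes
    d_vt = (Ht @ w).clamp(min=1e-9)          # target-side vertex degrees
    Y = torch.diag(1.0 / d_es) @ Hs.t() @ X  # aggregate from source vertices
    return torch.diag(1.0 / d_vt) @ Ht @ torch.diag(w) @ Y  # deliver to targets
```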
2.7. Experiments and Discussions
2.7.1. Experimental Settings
①Learning rate: 0.01 for the 3 citation network datasets and 0.001 for the 2 social media network datasets
②Dropout: 0.5
③Optimizer: Adam with 0.0005 weight decay
④Loss: cross-entropy over the labeled vertex set $\mathcal{Y}_{L}$:

$$\mathcal{L}=-\sum_{i\in\mathcal{Y}_{L}}\sum_{c=1}^{C}\mathbf{Y}_{ic}\ln\mathbf{Z}_{ic}$$
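These settings translate directly into PyTorch (the two-layer `model` below is only a stand-in for an HGNN+-style network, e.g. the `HGNNPlusConv` sketch above; data is random):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(),
                      nn.Dropout(0.5), nn.Linear(64, 7))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)
criterion = nn.CrossEntropyLoss()  # the CE loss in (4), over labeled vertices

X, y = torch.randn(32, 128), torch.randint(0, 7, (32,))
for _ in range(200):
    optimizer.zero_grad()
    loss = criterion(model(X), y)
    loss.backward()
    optimizer.step()
```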
2.7.2. Vertex Classification on the Data With Graph Structure
①Statistics of the datasets:
②Performance on 5 datasets when 5 samples per category were trained:
③Performance on 5 datasets when 10 samples per category were trained:
④Comparison on the effectiveness of different hyperedge groups with HGNN+:
⑤Comparison of HGNN and HGNN+:
⑥Convolution strategy comparison:
2.7.3. Vertex Classification on the Data Without Graph Structure
①3D object datasets: ModelNet40 and NTU
②Performance:
2.7.4. Vertex Classification on the Data With Hypergraph Structure
①Datasets: Cooking-200 and MovieLens2k-v2
②Performance:
2.7.5. Visualization
①t-SNE visualization on Cooking-200:
2.8. THU-DeepHypergraph: An Open Toolbox of the HGNN+ Framework
~
2.9. Conclusion
~
3. Reference
@ARTICLE{9795251,
author={Gao, Yue and Feng, Yifan and Ji, Shuyi and Ji, Rongrong},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
title={HGNN+: General Hypergraph Neural Networks},
year={2023},
volume={45},
number={3},
pages={3181-3199},
keywords={Correlation;Convolution;Data models;Task analysis;Social networking (online);Mathematical models;Representation learning;Hypergraph;classification;hypergraph convolution;representation learning},
doi={10.1109/TPAMI.2022.3182052}}