1. Spectral Clustering with Graph Neural Networks for Graph Pooling (ICML 2020)
This code reproduces the experimental results obtained with the MinCutPool layer as presented in the ICML 2020 paper *Spectral Clustering with Graph Neural Networks for Graph Pooling* by F. M. Bianchi*, D. Grattarola*, and C. Alippi. The official implementation of the MinCutPool layer can be found in Spektral; a PyTorch implementation is also available in PyTorch Geometric.
https://github.com/FilippoMB/Spectral-Clustering-with-Graph-Neural-Networks-for-Graph-Pooling
Published at ICML 2020, this work performs pooling via a relaxed minimum-cut objective. Its main advantage is that it is fully differentiable and requires no spectral decomposition, and it achieves strong results on image segmentation, graph classification, and clustering.
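To make the "differentiable, no eigendecomposition" point concrete, here is a minimal NumPy sketch of one MinCut pooling step as described in the paper: a soft cluster assignment coarsens the graph, and two unsupervised losses (a relaxed min-cut trace ratio and an orthogonality penalty) guide the assignment. This is an illustrative sketch, not the official Spektral or PyTorch Geometric implementation; function and variable names are mine.

```python
import numpy as np

def mincut_pool(X, A, S):
    """One MinCutPool step (illustrative sketch).

    X: node features (N x F), A: adjacency (N x N),
    S: soft cluster assignment (N x K, rows sum to 1).
    Returns pooled features/adjacency and the two unsupervised losses.
    """
    # Coarsen the graph with the soft assignment
    X_pool = S.T @ X                      # K x F pooled features
    A_pool = S.T @ A @ S                  # K x K pooled adjacency

    # Degree matrix of the original graph
    D = np.diag(A.sum(axis=1))

    # Relaxed min-cut loss: maximize within-cluster edge mass.
    # Everything is a matrix product, so it is differentiable and
    # needs no spectral decomposition.
    L_cut = -np.trace(S.T @ A @ S) / np.trace(S.T @ D @ S)

    # Orthogonality loss: push S toward balanced, near-orthogonal clusters
    K = S.shape[1]
    SS = S.T @ S
    L_ortho = np.linalg.norm(SS / np.linalg.norm(SS) - np.eye(K) / np.sqrt(K))
    return X_pool, A_pool, L_cut, L_ortho
```

On a graph of two disconnected cliques with a perfect two-cluster assignment, the cut loss reaches its minimum of -1 and the orthogonality loss is 0.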
2. Strategies for Pre-training Graph Neural Networks (ICLR 2020)
This is a PyTorch implementation of the following paper: Weihua Hu*, Bowen Liu*, Joseph Gomes, Marinka Zitnik, Percy Liang, Vijay Pande, Jure Leskovec. Strategies for Pre-training Graph Neural Networks. ICLR 2020. arXiv / OpenReview. If you make use of the code/experiments in your work, please cite the paper (BibTeX below).
@inproceedings{hu2020pretraining,
  title={Strategies for Pre-training Graph Neural Networks},
  author={Weihua Hu and Bowen Liu and Joseph Gomes and Marinka Zitnik and Percy Liang and Vijay Pande and Jure Leskovec},
  booktitle={International Conference on Learning Representations},
  year={2020},
  url={https://openreview.net/forum?id=HJlWWJSFDH},
}
Published at ICLR 2020, this paper proposes a series of pre-training strategies for graph neural networks; the code is open-source. Related commentary.
3. When Does Self-Supervision Help Graph Convolutional Networks? (ICML 2020)
PyTorch code for *When Does Self-Supervision Help Graph Convolutional Networks?* [supplement] by Yuning You*, Tianlong Chen*, Zhangyang Wang, Yang Shen. In ICML 2020.
Overview
Properly designed multi-task self-supervision helps GCNs gain generalizability and robustness. In this repository the authors verify this by running experiments on several GCN architectures with three designed self-supervised tasks: node clustering, graph partitioning, and graph completion. This paper is quite interesting, and we will keep following it; we mark it here for now. The work designs three self-supervised tasks (clustering, partitioning, and completion) to learn graph representations with better generalizability and robustness.
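As a concrete illustration of how such pretext tasks plug into training, here is a hedged NumPy sketch of a graph-completion-style task (mask some node features, then ask the model to reconstruct them) combined with the supervised loss in a multi-task objective. The function names, masking rate, and loss weighting are hypothetical, chosen for illustration, and are not taken from the authors' repository.

```python
import numpy as np

def masked_completion_target(X, mask_rate=0.15, rng=None):
    """Graph-completion-style pretext task (illustrative sketch):
    zero out the features of randomly chosen nodes; the self-supervised
    objective is to reconstruct X[mask] from the rest of the graph.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    mask = rng.random(X.shape[0]) < mask_rate  # which nodes to hide
    X_in = X.copy()
    X_in[mask] = 0.0                           # masked input to the GCN
    return X_in, mask                          # model should predict X[mask]

def joint_loss(l_sup, l_ss, weight=0.5):
    """Multi-task objective: supervised loss plus a weighted
    self-supervised (pretext) loss. The weight is a tunable knob."""
    return l_sup + weight * l_ss
```

Node clustering and graph partitioning would slot into the same pattern: each pretext task contributes its own `l_ss` term to the joint objective.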