# GraphMix: Regularized Training of Graph Neural Networks for Semi-Supervised Learning

@article{Verma2019GraphMixRT,
  title   = {GraphMix: Regularized Training of Graph Neural Networks for Semi-Supervised Learning},
  author  = {Vikas Verma and Meng Qu and Alex Lamb and Yoshua Bengio and Juho Kannala and Jian Tang},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1909.11715}
}

We present GraphMix, a regularization technique for Graph Neural Network-based semi-supervised object classification, leveraging recent advances in the regularization of classical deep neural networks. Specifically, we propose a unified approach in which we train a fully-connected network jointly with the graph neural network via parameter sharing, interpolation-based regularization, and self-predicted targets. Our proposed method is architecture-agnostic in the sense that it can be applied…
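The interpolation-based regularization the abstract refers to is Mixup-style blending of examples and targets. Below is a minimal, generic sketch of that ingredient (function name and shapes are illustrative; GraphMix applies interpolation within a jointly trained fully-connected network, and this is not the authors' implementation):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=1.0, rng=None):
    """Mixup-style interpolation: blend two examples and their
    one-hot targets with a Beta-distributed mixing weight."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)          # lam in [0, 1]
    x_mix = lam * x1 + (1.0 - lam) * x2
    y_mix = lam * y1 + (1.0 - lam) * y2
    return x_mix, y_mix

# Two node feature vectors with one-hot labels
x_a, y_a = np.array([1.0, 0.0, 2.0]), np.array([1.0, 0.0])
x_b, y_b = np.array([0.0, 4.0, 2.0]), np.array([0.0, 1.0])
x_m, y_m = mixup(x_a, y_a, x_b, y_b)
```

The mixed target is a convex combination of the two labels, so the loss on mixed examples encourages linear behaviour between training points.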


#### 34 Citations

Graph Random Neural Network

- Computer Science
- ArXiv
- 2020

This work proposes consistency regularization for GRAND by leveraging the distributional consistency of unlabeled nodes under multiple augmentations, improving the generalization capacity of the model.

Graph Symbiosis Learning

- Computer Science
- ArXiv
- 2021

A novel adaptive exchange method to iteratively substitute redundant channels in the weight matrix of one GNN with informative channels of another GNN in a layer-by-layer manner is proposed.

Effective Training Strategies for Deep Graph Neural Networks

- Computer Science
- ArXiv
- 2020

The proposed NodeNorm regularizes deep GCNs by discouraging feature-wise correlation of hidden embeddings and increasing model smoothness with respect to input node features, and thus effectively reduces overfitting, enabling deep GNNs to compete with and even outperform shallow ones.

Bag of Tricks for Training Deeper Graph Neural Networks: A Comprehensive Benchmark Study

- Computer Science
- ArXiv
- 2021

The first fair and reproducible benchmark dedicated to assessing the "tricks" of training deep GNNs is presented, and it is demonstrated that an organic combination of initial connection, identity mapping, group and batch normalization has the most ideal performance on large datasets.

NodeAug: Semi-Supervised Node Classification with Data Augmentation

- Computer Science
- KDD
- 2020

The NodeAug (Node-Parallel Augmentation) scheme, which creates a 'parallel universe' for each node to conduct DA, to block the undesired effects from other nodes, yields significant gains for strong GCN models on the Cora, Citeseer, Pubmed, and two co-authorship networks, with a more efficient training process thanks to the proposed subgraph mini-batch training approach.

GraphSAD: Learning Graph Representations

- 2020

Graph Neural Networks (GNNs) learn effective node/graph representations by aggregating the attributes of neighboring nodes, which commonly derives a single representation mixing the information of…

Network representation learning: A macro and micro view

- 2021

A graph is a universal data structure that is widely used to organize real-world data. Various real-world networks, such as transportation, social, and academic networks, can be represented by…

Distance-wise Graph Contrastive Learning

- Computer Science
- ArXiv
- 2020

The Distance-wise Graph Contrastive Learning (DwGCL) method applies contrastive learning to graph learning adaptively by taking the task information received by each node into consideration, and brings a clear improvement over previous GCL methods.

Understanding and Resolving Performance Degradation in Graph Convolutional Networks

- Computer Science, Mathematics
- 2020

A variance-controlling technique termed Node Normalization (NodeNorm), which scales each node's features using its own standard deviation, enables deep GCNs to outperform shallow ones in cases where deep models are needed, and to achieve comparable results with shallow ones on 6 benchmark datasets.
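The per-node scaling described above can be sketched in a few lines. This assumes the simplest form, division by each node's own standard deviation across the feature dimension; the paper's variant may use a root exponent on the standard deviation, which is omitted here:

```python
import numpy as np

def node_norm(h, eps=1e-6):
    """NodeNorm sketch: rescale each node's hidden vector by its own
    standard deviation, computed across the feature dimension."""
    std = h.std(axis=1, keepdims=True)
    return h / (std + eps)

H = np.array([[1.0, 3.0, 5.0],       # two nodes, three hidden features
              [10.0, 10.5, 9.5]])
H_n = node_norm(H)                   # each row now has unit std
```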

Uncertainty-Matching Graph Neural Networks to Defend Against Poisoning Attacks

- Computer Science, Mathematics
- AAAI
- 2021

This work proposes to build a surrogate predictor that does not directly access the graph structure, but systematically extracts reliable knowledge from a standard GNN through a novel uncertainty-matching strategy, which makes UM-GNN immune to evasion attacks by design, and achieves significantly improved robustness against poisoning attacks.

#### References

SHOWING 1-10 OF 58 REFERENCES

Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning

- Computer Science, Mathematics
- AAAI
- 2018

It is shown that the graph convolution of the GCN model is actually a special form of Laplacian smoothing, which is the key reason why GCNs work, but it also brings potential concerns of over-smoothing with many convolutional layers.
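The over-smoothing effect mentioned above can be observed numerically in a toy sketch: repeatedly applying the symmetrically normalized propagation matrix (no learned weights or nonlinearity, a simplification of a deep GCN) collapses node features toward near-agreement:

```python
import numpy as np

# Toy path graph on 4 nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                        # add self-loops
d = A_hat.sum(axis=1)
S = A_hat / np.sqrt(np.outer(d, d))          # D^-1/2 (A + I) D^-1/2

H = np.array([[1.0], [0.0], [0.0], [1.0]])   # one scalar feature per node
for _ in range(100):
    H = S @ H                                # stack many "layers"
spread = H.max() - H.min()                   # features have nearly collapsed
```

After many applications the features converge toward the dominant eigenvector (proportional to the square roots of node degrees), so initially distinct nodes become almost indistinguishable.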

Graph Attention Networks

- Mathematics, Computer Science
- ICLR
- 2018

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
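The masked self-attention of a single GAT head can be sketched as follows; this is an illustrative dense-matrix version (real implementations operate on sparse edge lists), with self-loops included so every node attends to at least itself:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_attention(h, W, a, adj):
    """One GAT attention head (sketch):
    e_ij = LeakyReLU(a^T [W h_i || W h_j]), softmax-normalized over
    each node's neighbourhood; non-edges are masked with -inf."""
    z = h @ W                                   # projected node features
    n = z.shape[0]
    mask = adj + np.eye(n)                      # neighbours plus self-loop
    e = np.full((n, n), -np.inf)
    for i in range(n):
        for j in range(n):
            if mask[i, j] > 0:
                e[i, j] = leaky_relu(a @ np.concatenate([z[i], z[j]]))
    e = e - e.max(axis=1, keepdims=True)        # numerically stable softmax
    alpha = np.exp(e)                           # exp(-inf) = 0 masks non-edges
    return alpha / alpha.sum(axis=1, keepdims=True)

adj = np.array([[0.0, 1.0], [1.0, 0.0]])        # two connected nodes
h = np.array([[1.0, 0.0], [0.0, 1.0]])
alpha = gat_attention(h, np.eye(2), np.array([1.0, -1.0, 0.5, 0.5]), adj)
```

Each row of `alpha` sums to one, giving the weights with which a node aggregates its neighbours' projected features.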

Deep Convolutional Networks on Graph-Structured Data

- Computer Science
- ArXiv
- 2015

This paper develops an extension of Spectral Networks that incorporates a Graph Estimation procedure, which is tested on large-scale classification problems, matching or improving over Dropout Networks with far fewer parameters to estimate.

Pitfalls of Graph Neural Network Evaluation

- Computer Science, Mathematics
- ArXiv
- 2018

This paper performs a thorough empirical evaluation of four prominent GNN models and suggests that simpler GNN architectures are able to outperform the more sophisticated ones if the hyperparameters and the training procedure are tuned fairly for all models.

Deep Graph Infomax

- Computer Science, Mathematics
- ICLR
- 2019

Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.

Semi-Supervised Classification with Graph Convolutional Networks

- Computer Science, Mathematics
- ICLR
- 2017

A scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks operating directly on graphs, and outperforms related methods by a significant margin.
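The propagation rule of this GCN variant is well known: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W). A minimal single-layer sketch (toy graph and weights are illustrative):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])         # adjacency with self-loops
    d = A_hat.sum(axis=1)                  # augmented degrees
    S = A_hat / np.sqrt(np.outer(d, d))    # symmetric normalization
    return np.maximum(S @ H @ W, 0.0)      # propagate, transform, ReLU

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)     # toy 3-node star graph
H = np.eye(3)                              # one-hot input features
W = np.array([[1.0, -1.0],
              [0.5, 0.5],
              [0.5, 0.5]])                 # random-looking weights
H_out = gcn_layer(A, H, W)
```

Stacking such layers lets each node aggregate information from progressively larger neighbourhoods.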

Variational Graph Auto-Encoders

- Computer Science, Mathematics
- ArXiv
- 2016

The variational graph auto-encoder (VGAE) is introduced, a framework for unsupervised learning on graph-structured data based on the variational auto-encoder (VAE) that can naturally incorporate node features, which significantly improves predictive performance on a number of benchmark datasets.

Graph Neural Networks: A Review of Methods and Applications

- Mathematics, Computer Science
- AI Open
- 2020

A detailed review of existing graph neural network models is provided, the applications are systematically categorized, and four open problems for future research are proposed.

Geometric Deep Learning on Graphs and Manifolds Using Mixture Model CNNs

- Computer Science
- 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2017

This paper proposes a unified framework that generalizes CNN architectures to non-Euclidean domains (graphs and manifolds) and learns local, stationary, and compositional task-specific features; the proposed method is tested on standard tasks from the realms of image, graph, and 3D shape analysis and consistently outperforms previous approaches.

Batch Virtual Adversarial Training for Graph Convolutional Networks

- Computer Science, Mathematics
- ArXiv
- 2019

Two algorithms are proposed, sample-based and optimization-based BVAT, which are suitable to promote the smoothness of the model for graph-structured data by either finding virtual adversarial perturbations for a subset of nodes far from each other or generating virtual adversaries for all nodes with an optimization process.