Noisy Node Classification by Bi-level Optimization based Multi-teacher Distillation

Y Liu, Z Wu, Z Lu, C Nie, G Wen, P Hu, X Zhu
arXiv preprint arXiv:2404.17875, 2024
Previous graph neural networks (GNNs) usually assume that graph data comes with clean labels for representation learning, but this is rarely true in real applications. In this paper, we propose a new multi-teacher distillation method based on bi-level optimization (namely BO-NNC) to conduct noisy node classification on graph data. Specifically, we first employ multiple self-supervised learning methods to train diverse teacher models, and then aggregate their predictions through a teacher weight matrix. Furthermore, we design a new bi-level optimization strategy that dynamically adjusts the teacher weight matrix based on the training progress of the student model. Finally, we design a label improvement module to improve label quality. Extensive experimental results on real datasets show that our method achieves the best results compared to state-of-the-art methods.
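The aggregation step the abstract describes, combining several teachers' predictions through a weight matrix, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-teacher scalar weights, the softmax normalization, and all shapes are assumptions, and in BO-NNC the weights would be updated by the outer level of the bi-level optimization rather than fixed.

```python
import numpy as np

rng = np.random.default_rng(0)
n_teachers, n_nodes, n_classes = 4, 5, 3

# Each teacher outputs a (n_nodes, n_classes) matrix of class probabilities;
# here we draw random rows from a Dirichlet so each row sums to 1.
teacher_preds = rng.dirichlet(np.ones(n_classes), size=(n_teachers, n_nodes))

# Hypothetical teacher weights: one scalar per teacher, softmax-normalized.
logits = rng.normal(size=n_teachers)
weights = np.exp(logits) / np.exp(logits).sum()

# Weighted aggregation of teacher predictions -> soft targets for the student.
soft_targets = np.einsum("t,tnc->nc", weights, teacher_preds)

# Since the weights and each teacher's rows sum to 1, so do the soft targets.
assert np.allclose(soft_targets.sum(axis=1), 1.0)
```

A student model would then be trained against `soft_targets` (e.g., with a KL-divergence loss), while the outer optimization level adjusts the teacher weights according to the student's progress.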