Graph Convolutional Networks with Latent Label Propagation
- Subject (Keywords): Backpropagation, Graph convolutional networks, Label propagation, Oversmoothing
- Subject (DDC): 006.31
- Institution: Ajou University
- Advisor: 신현정
- Year of Publication: 2023
- Degree Conferred: February 2023
- Degree: Master's
- Department and Major: Department of Artificial Intelligence, Graduate School
- URI: http://www.dcollection.net/handler/ajou/000000032565
- Language: English
- Copyright: Ajou University theses are protected by copyright.
Abstract
Graph convolutional networks (GCNs) and their derived models are known to be effective in semi-supervised learning, improving model performance by exploiting both labeled and unlabeled data through the graph structure. GCNs also show high performance on various problems such as node classification and link prediction. However, because GCNs and their derivatives reflect the graph structure through the adjacency matrix, they have the disadvantage that the model must be built deeply to use information from distant nodes. Moreover, when models are built deeply, the features of nodes come to be represented similarly and classification performance deteriorates, a phenomenon defined as oversmoothing. In this paper, we propose the Latent Label Propagation (LLP) model, which combines label propagation with GCNs to solve the aforementioned problems (e.g., undersmoothing and oversmoothing). Unlike existing GNN models, which take only the adjacency matrix and node features as input, we additionally use labels as input and improve performance on node classification problems by adjusting, through learnable parameters, the degree to which labeled nodes propagate during training. In experiments on various datasets, we compare against previously proposed models and confirm that updating node representations using features across the global structure of the graph yields better classification performance than using only information from predefined neighbor nodes. Finally, we evaluate the effectiveness of the label input and the global aggregation.
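The label-propagation component the abstract builds on can be sketched as follows. This is a minimal illustration of classic graph label propagation over a symmetrically normalized adjacency matrix, not the thesis's actual LLP implementation: the function names, the fixed mixing coefficient `alpha` (which LLP instead learns as a parameter), and the toy graph are all assumptions for illustration.

```python
import numpy as np

def normalize_adjacency(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def propagate_labels(A, Y, labeled_mask, alpha=0.9, num_iters=50):
    """Iterate Z <- alpha * S @ Z + (1 - alpha) * Y0, keeping labeled
    nodes anchored to their one-hot labels while spreading label mass
    to unlabeled nodes along graph edges."""
    S = normalize_adjacency(A)
    Y0 = Y * labeled_mask[:, None]  # zero out rows of unlabeled nodes
    Z = Y0.copy()
    for _ in range(num_iters):
        Z = alpha * (S @ Z) + (1 - alpha) * Y0
    return Z

# Toy path graph 0-1-2-3; only the endpoint nodes are labeled.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
Y = np.array([[1, 0], [0, 0], [0, 0], [0, 1]], dtype=float)
mask = np.array([1, 0, 0, 1], dtype=float)

Z = propagate_labels(A, Y, mask)
pred = Z.argmax(axis=1)  # each unlabeled node adopts its nearest label
```

On this toy graph, each interior node ends up assigned to the class of the labeled endpoint closest to it; LLP's contribution, per the abstract, is to let the propagation degree of labeled nodes be adjusted during GCN training rather than fixed in advance.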
Table of Contents
1 Introduction 1
2 Fundamentals 6
3 Proposed Methods 11
3.1 Latent Label Propagation 12
3.1.1 Smoothed Aggregation 12
3.1.2 Labeled Features 12
4 Experiments 16
4.1 Semi-supervised Classification 19
4.2 Classification Performance by Aggregation Range 23
4.3 Feature Smoothness Comparison 26
4.4 Ablation Study 29
5 Conclusion 33
References 35