TY - GEN
T1 - Graph Contrastive Learning Meets Graph Meta Learning
T2 - 33rd ACM Web Conference, WWW 2024
AU - Liu, Hao
AU - Feng, Jiarui
AU - Kong, Lecheng
AU - Tao, Dacheng
AU - Chen, Yixin
AU - Zhang, Muhan
N1 - Publisher Copyright:
© 2024 Owner/Author.
PY - 2024/5/13
Y1 - 2024/5/13
N2 - Graph Neural Networks (GNNs) have become popular tools for Graph Representation Learning (GRL). One fundamental problem is few-shot node classification. Most existing methods follow the meta learning paradigm, demonstrating fast generalization to few-shot tasks. However, recent works indicate that graph contrastive learning combined with fine-tuning can significantly outperform meta learning methods. Despite this empirical success, there is limited understanding of the reasons behind it. In our study, we first identify two crucial advantages of contrastive learning over meta learning: (1) the comprehensive utilization of graph nodes and (2) the power of graph augmentations. To integrate the strengths of both contrastive learning and meta learning on few-shot node classification tasks, we introduce a new paradigm: Contrastive Few-Shot Node Classification (COLA). Specifically, COLA identifies semantically similar nodes only from augmented graphs, enabling the construction of meta-tasks without label information. Therefore, COLA can incorporate all nodes to construct meta-tasks, reducing the risk of overfitting. Through extensive experiments, we validate the necessity of each component in our design and demonstrate that COLA achieves new state-of-the-art results on all tasks.
AB - Graph Neural Networks (GNNs) have become popular tools for Graph Representation Learning (GRL). One fundamental problem is few-shot node classification. Most existing methods follow the meta learning paradigm, demonstrating fast generalization to few-shot tasks. However, recent works indicate that graph contrastive learning combined with fine-tuning can significantly outperform meta learning methods. Despite this empirical success, there is limited understanding of the reasons behind it. In our study, we first identify two crucial advantages of contrastive learning over meta learning: (1) the comprehensive utilization of graph nodes and (2) the power of graph augmentations. To integrate the strengths of both contrastive learning and meta learning on few-shot node classification tasks, we introduce a new paradigm: Contrastive Few-Shot Node Classification (COLA). Specifically, COLA identifies semantically similar nodes only from augmented graphs, enabling the construction of meta-tasks without label information. Therefore, COLA can incorporate all nodes to construct meta-tasks, reducing the risk of overfitting. Through extensive experiments, we validate the necessity of each component in our design and demonstrate that COLA achieves new state-of-the-art results on all tasks.
KW - few-shot learning
KW - node classification
KW - unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=85194054223&partnerID=8YFLogxK
U2 - 10.1145/3589334.3645367
DO - 10.1145/3589334.3645367
M3 - Conference contribution
AN - SCOPUS:85194054223
T3 - WWW 2024 - Proceedings of the ACM Web Conference
SP - 365
EP - 376
BT - WWW 2024 - Proceedings of the ACM Web Conference
PB - Association for Computing Machinery, Inc
Y2 - 13 May 2024 through 17 May 2024
ER -