Large Language Model Meets Graph Neural Network in Knowledge Distillation

Shengxiang Hu, Guobing Zou, Song Yang, Shiyi Lin, Yanglan Gan, Bofeng Zhang, Yixin Chen

Research output: Contribution to journal › Conference article › peer-review

Abstract

While Large Language Models (LLMs) show promise for learning on Text-Attributed Graphs (TAGs), their deployment is hindered by heavy computational demands. Graph Neural Networks (GNNs) are efficient but struggle with the complex semantics of TAGs. We propose LinguGKD, a novel LLM-to-GNN knowledge distillation framework that transfers both local semantic details and global structural information from LLMs to GNNs. First, it introduces TAG-oriented instruction tuning, enhancing LLMs with graph-specific knowledge through carefully designed prompts. Next, it develops a layer-adaptive multi-scale contrastive distillation strategy that aligns LLM and GNN features at multiple granularities, from node level to graph level. Finally, the distilled GNNs combine the semantic richness of LLMs with the computational efficiency of traditional GNNs. Experiments demonstrate that LinguGKD outperforms existing graph distillation frameworks; the distilled simple GNNs achieve comparable or superior performance to more complex GNNs and to the teacher LLMs, while maintaining computational efficiency. This work bridges the gap between LLMs and GNNs, facilitating advanced graph learning in resource-constrained environments and providing a framework for leveraging ongoing LLM advancements to improve GNNs.
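The layer-adaptive multi-scale contrastive distillation described above can be sketched as an InfoNCE-style alignment between teacher (LLM) and student (GNN) node embeddings, summed over layers with per-layer weights. This is a minimal illustrative sketch, not the paper's implementation: the function names, the fixed layer weights (the paper's strategy adapts them), and the assumption that teacher and student embeddings are already projected to a shared dimension are all assumptions made here for brevity.

```python
import numpy as np

def nce_distill_loss(student_h, teacher_h, temperature=0.1):
    """InfoNCE-style contrastive loss: pull each student node embedding
    toward its matching teacher embedding (diagonal positives) and push
    it away from embeddings of other nodes (off-diagonal negatives).
    Illustrative sketch; assumes both inputs share shape (N, d)."""
    s = student_h / np.linalg.norm(student_h, axis=1, keepdims=True)
    t = teacher_h / np.linalg.norm(teacher_h, axis=1, keepdims=True)
    logits = (s @ t.T) / temperature             # (N, N) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # positives on the diagonal

def layer_adaptive_loss(student_layers, teacher_layers, weights=None):
    """Weighted sum of per-layer contrastive losses. The paper learns
    layer-adaptive weights; uniform weights are used here as a stand-in."""
    if weights is None:
        weights = np.ones(len(student_layers)) / len(student_layers)
    return sum(w * nce_distill_loss(s, t)
               for w, s, t in zip(weights, student_layers, teacher_layers))
```

Under this sketch, a student whose embeddings already match the teacher's incurs a lower loss than a randomly initialized one, which is the gradient signal that drives the distillation.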

Original language: English
Pages (from-to): 17295-17304
Number of pages: 10
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 39
Issue number: 16
State: Published - Apr 11, 2025
Event: 39th Annual AAAI Conference on Artificial Intelligence, AAAI 2025 - Philadelphia, United States
Duration: Feb 25, 2025 - Mar 4, 2025

