TY - GEN
T1 - PALT: Parameter-Lite Transfer of Language Models for Knowledge Graph Completion
T2 - 2022 Findings of the Association for Computational Linguistics: EMNLP 2022
AU - Shen, Jianhao
AU - Wang, Chenguang
AU - Yuan, Ye
AU - Han, Jiawei
AU - Ji, Heng
AU - Sen, Koushik
AU - Zhang, Ming
AU - Song, Dawn
N1 - Publisher Copyright:
© 2022 Association for Computational Linguistics.
PY - 2022
Y1 - 2022
N2 - This paper presents a parameter-lite transfer learning approach that adapts pretrained language models (LMs) to knowledge graph (KG) completion. Instead of finetuning, which modifies all LM parameters, we tune only a few new parameters while keeping the original LM parameters fixed. We achieve this by reformulating KG completion as a “fill-in-the-blank” task and introducing a parameter-lite encoder on top of the original LM. We show that, by tuning far fewer parameters than finetuning, LMs transfer non-trivially to most tasks and are competitive with prior state-of-the-art approaches. For instance, we outperform fully finetuned approaches on a KG completion benchmark by tuning only 1% of the parameters.
AB - This paper presents a parameter-lite transfer learning approach that adapts pretrained language models (LMs) to knowledge graph (KG) completion. Instead of finetuning, which modifies all LM parameters, we tune only a few new parameters while keeping the original LM parameters fixed. We achieve this by reformulating KG completion as a “fill-in-the-blank” task and introducing a parameter-lite encoder on top of the original LM. We show that, by tuning far fewer parameters than finetuning, LMs transfer non-trivially to most tasks and are competitive with prior state-of-the-art approaches. For instance, we outperform fully finetuned approaches on a KG completion benchmark by tuning only 1% of the parameters.
UR - https://www.scopus.com/pages/publications/85149889015
U2 - 10.18653/v1/2022.findings-emnlp.47
DO - 10.18653/v1/2022.findings-emnlp.47
M3 - Conference contribution
AN - SCOPUS:85149889015
T3 - Findings of the Association for Computational Linguistics: EMNLP 2022
SP - 3862
EP - 3876
BT - Findings of the Association for Computational Linguistics: EMNLP 2022
A2 - Goldberg, Yoav
A2 - Kozareva, Zornitsa
A2 - Zhang, Yue
PB - Association for Computational Linguistics (ACL)
Y2 - 7 December 2022 through 11 December 2022
ER -