TY - JOUR
T1 - Extending the Design Space of Graph Neural Networks by Rethinking Folklore Weisfeiler-Lehman
AU - Feng, Jiarui
AU - Kong, Lecheng
AU - Liu, Hao
AU - Tao, Dacheng
AU - Li, Fuhai
AU - Zhang, Muhan
AU - Chen, Yixin
N1 - Publisher Copyright:
© 2023 Neural Information Processing Systems Foundation. All rights reserved.
PY - 2023
Y1 - 2023
AB - Message passing neural networks (MPNNs) have emerged as the most popular framework of graph neural networks (GNNs) in recent years. However, their expressive power is limited by the 1-dimensional Weisfeiler-Lehman (1-WL) test. Some works are inspired by k-WL/FWL (Folklore WL) and design the corresponding neural versions. Despite the high expressive power, this line of research has serious limitations. In particular, (1) k-WL/FWL requires at least O(n^k) space complexity, which is impractical for large graphs even when k = 3; (2) the design space of k-WL/FWL is rigid, with the only adjustable hyper-parameter being k. To tackle the first limitation, we propose an extension, (k, t)-FWL. We theoretically prove that even if we fix the space complexity to O(n^k) (for any k ≥ 2) in (k, t)-FWL, we can construct an expressiveness hierarchy up to solving the graph isomorphism problem. To tackle the second problem, we propose k-FWL+, which considers any equivariant set as neighbors instead of all nodes, thereby greatly expanding the design space of k-FWL. Combining these two modifications results in a flexible and powerful framework, (k, t)-FWL+. We demonstrate that (k, t)-FWL+ can implement most existing models with matching expressiveness. We then introduce an instance of (k, t)-FWL+ called Neighborhood^2-FWL (N^2-FWL), which is practically and theoretically sound. We prove that N^2-FWL is no less powerful than 3-WL and can encode many substructures while only requiring O(n^2) space. Finally, we design its neural version, named N^2-GNN, and evaluate its performance on various tasks. N^2-GNN achieves record-breaking results on ZINC-Subset (0.059), outperforming previous SOTA results by 10.6%. Moreover, N^2-GNN achieves new SOTA results on the BREC dataset (71.8%) among all existing high-expressive GNN methods.
UR - http://www.scopus.com/inward/record.url?scp=85181840176&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85181840176
SN - 1049-5258
VL - 36
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Y2 - 10 December 2023 through 16 December 2023
ER -