Abstract
Graph neural networks (GNNs) have demonstrated impressive achievements in diverse graph tasks, and research on their expressive power has grown significantly in recent years. The well-known Weisfeiler and Lehman (WL) isomorphism test has been widely used to assess GNNs' ability to distinguish graph structures. However, despite being considered less expressive than other GNNs in graph-level tasks under the WL test, two prominent GNN models, namely graph convolution networks (GCN) and graph attention networks (GAT), still exhibit strong performance in node-level classification tasks. In this paper, we present a comprehensive analysis of their expressive power using a novel evaluation metric: the number of linear regions. We demonstrate that by enhancing GCN with refined graph Ricci curvature, our proposed high-rank graph convolution network (HRGCN) can match or even surpass the predictive performance of attention models. Thus, the two models exhibit equivalent node-level expressive power. This fresh perspective highlights the importance of evaluating GNNs' expressive power on node-level classification rather than solely at the graph level. Experimental results show that the proposed HRGCN model outperforms the state-of-the-art in various classification and prediction tasks.
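To make the "number of linear regions" metric concrete, the following is a minimal sketch (not the paper's construction) that empirically estimates the number of linear regions of a one-hidden-layer ReLU network by counting distinct activation patterns over a sampled input grid; the network weights, input dimension, and grid resolution are all illustrative assumptions.

```python
import numpy as np

# Illustrative assumption: a random one-hidden-layer ReLU network,
# not the paper's HRGCN architecture.
rng = np.random.default_rng(0)
d_in, d_hidden = 2, 8
W = rng.standard_normal((d_hidden, d_in))
b = rng.standard_normal(d_hidden)

# Sample points from a dense grid over a bounded input domain.
grid = np.linspace(-3.0, 3.0, 200)
xs = np.stack(np.meshgrid(grid, grid), axis=-1).reshape(-1, d_in)

# Each distinct ReLU on/off pattern corresponds to one linear region
# of the piecewise-linear function (restricted to the sampled domain).
patterns = (xs @ W.T + b) > 0
n_regions = len({p.tobytes() for p in patterns})
print(n_regions)
```

Since the domain is only sampled, this counts a lower bound on the true number of regions; with `d_hidden` hyperplanes in the plane, the count is also capped combinatorially.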
Original language | English |
---|---|
Pages (from-to) | 1199-1214 |
Number of pages | 16 |
Journal | Proceedings of Machine Learning Research |
Volume | 222 |
Publication status | Published - 2023 |
Event | 15th Asian Conference on Machine Learning, ACML 2023 - Istanbul, Turkey |
Duration | 11 Nov 2023 → 14 Nov 2023 |
Keywords
- Expressive Equivalence
- Graph Neural Networks
- Number of Linear Regions