Residual Hyperbolic Graph Convolution Networks
DOI:
https://doi.org/10.1609/aaai.v38i15.29559
Keywords:
ML: Learning with Manifolds, ML: Deep Learning Algorithms, ML: Optimization
Abstract
Hyperbolic graph convolutional networks (HGCNs) have demonstrated strong representational capability for modeling hierarchically structured graphs. However, as in general GCNs, over-smoothing may occur as the number of model layers increases, limiting the representational capability of most current HGCN models. In this paper, we propose residual hyperbolic graph convolutional networks (R-HGCNs) to address the over-smoothing problem. We introduce a hyperbolic residual connection function to overcome the over-smoothing problem, and also theoretically prove the effectiveness of the hyperbolic residual function. Moreover, we use product manifolds and HyperDrop to facilitate the R-HGCNs. The distinctive features of the R-HGCNs are as follows: (1) The hyperbolic residual connection preserves the initial node information in each layer and adds a hyperbolic identity mapping to prevent node features from becoming indistinguishable. (2) Product manifolds in R-HGCNs are set up with different origin points in different components to facilitate the extraction of feature information from a wider range of perspectives, which enhances the representational capability of R-HGCNs. (3) HyperDrop adds multiplicative Gaussian noise into hyperbolic representations, such that perturbations can be added to alleviate the over-fitting problem without deconstructing the hyperbolic geometry. Experimental results demonstrate the effectiveness of R-HGCNs under various numbers of graph convolution layers and different structures of product manifolds.
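The two mechanisms described in the abstract can be illustrated with standard Poincaré-ball operations. The sketch below is an assumption-laden illustration, not the paper's implementation: it combines a layer output with the initial embedding via Möbius addition (a plausible reading of the hyperbolic residual connection), and applies multiplicative Gaussian noise in the tangent space at the origin (one way to perturb a representation without leaving the manifold, as HyperDrop is described as doing). All function names and the choice of curvature c = 1 are illustrative.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Mobius addition on the Poincare ball with curvature -c."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c ** 2 * x2 * y2
    return num / den

def expmap0(v, c=1.0):
    """Exponential map at the origin: tangent vector -> ball point."""
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return np.zeros_like(v)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def logmap0(x, c=1.0):
    """Logarithmic map at the origin: ball point -> tangent vector."""
    norm = np.linalg.norm(x)
    if norm < 1e-12:
        return np.zeros_like(x)
    return np.arctanh(np.sqrt(c) * norm) * x / (np.sqrt(c) * norm)

def hyperbolic_residual(h, x0, c=1.0):
    """Hypothetical residual connection: fold the initial embedding x0
    back into the layer output h via Mobius addition, so each layer
    retains the input information (the paper's exact form may differ)."""
    return mobius_add(h, x0, c)

def hyperdrop(x, sigma=0.1, c=1.0, rng=None):
    """Sketch of HyperDrop: multiplicative Gaussian noise applied to the
    tangent-space representation at the origin, then mapped back to the
    ball, so the perturbed point remains on the manifold."""
    rng = np.random.default_rng(rng)
    v = logmap0(x, c)
    noise = rng.normal(1.0, sigma, size=v.shape)
    return expmap0(v * noise, c)
```

Because both operations are closed on the Poincaré ball, the residual output and the noised output remain valid hyperbolic points (norm strictly below 1 for c = 1), which is the geometric property the abstract emphasizes.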
Published
2024-03-24
How to Cite
Xue, Y., Dai, J., Lu, Z., Wu, Y., & Jia, Y. (2024). Residual Hyperbolic Graph Convolution Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 38(15), 16247-16254. https://doi.org/10.1609/aaai.v38i15.29559
Section
AAAI Technical Track on Machine Learning VI