Adaptive frequency-based fully hyperbolic graph neural networks

  • Theoretical Advances
  • Published in Pattern Analysis and Applications

Abstract

Graph Convolutional Networks (GCNs) have attracted broad attention from industry and academia for their expressive power in modeling irregular data, e.g., skeletal data and graph-structured data. Among existing models, the fully hyperbolic graph neural network is arguably the most effective; however, it involves a large number of parameters and therefore consumes considerable computing resources. In this paper, we propose a model based on an adaptive frequency filter in hyperbolic space, together with a corresponding optimizer. The adaptive frequency filter learns the different frequency components of the node embeddings of a graph and adaptively balances the beneficial high-frequency and low-frequency signals. The optimizer operates on a subset of the orthogonal constraints and is dedicated to the adaptive frequency filter, which requires fewer parameters. Moreover, our model bridges the gap between hyperbolic space and the spectral domain to explore the underlying semantics of node and relation embeddings. Consequently, our model optimizes fewer parameters in hyperbolic space while avoiding the distortion introduced by conventional manifold GCNs. Experimental results on node classification show that our method achieves substantial improvements over state-of-the-art methods.
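The abstract describes two components: a per-channel adaptive frequency filter applied to node embeddings, and an optimizer restricted to a subset of orthogonal constraints. The sketch below illustrates only the first idea, under stated assumptions: it combines an AdaGNN-style learnable frequency response (cf. Dong et al., CIKM 2021) with Lorentz-model exponential and logarithmic maps so that the filtering happens in the tangent space of hyperbolic embeddings. All class names, shapes, and the choice of PyTorch are hypothetical; this is not the authors' released implementation.

```python
# Hypothetical sketch: an AdaGNN-style adaptive frequency filter applied in the
# tangent space of the Lorentz (hyperbolic) model. Names and shapes are assumptions.
import torch
import torch.nn as nn


def lorentz_expmap0(v, c=1.0):
    """Map a tangent vector v at the origin onto the Lorentz manifold of curvature -c."""
    sqrt_c = c ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp_min(1e-7)
    time = torch.cosh(sqrt_c * norm) / sqrt_c
    space = torch.sinh(sqrt_c * norm) * v / (sqrt_c * norm)
    return torch.cat([time, space], dim=-1)


def lorentz_logmap0(y, c=1.0):
    """Map a point y on the Lorentz manifold back to the tangent space at the origin."""
    sqrt_c = c ** 0.5
    time, space = y[..., :1], y[..., 1:]
    norm = space.norm(dim=-1, keepdim=True).clamp_min(1e-7)
    dist = torch.acosh((sqrt_c * time).clamp_min(1.0 + 1e-7)) / sqrt_c
    return dist * space / norm


class AdaptiveFrequencyFilter(nn.Module):
    """One adaptive-frequency step, x <- x - (L x) * phi, where phi is a learnable
    per-channel frequency response (cf. AdaGNN, Dong et al., 2021)."""

    def __init__(self, dim):
        super().__init__()
        self.phi = nn.Parameter(torch.full((dim,), 0.5))  # learnable frequency response

    def forward(self, x, laplacian):
        # laplacian: normalized graph Laplacian of shape (N, N), dense or sparse.
        lx = torch.sparse.mm(laplacian, x) if laplacian.is_sparse else laplacian @ x
        return x - lx * self.phi


class HyperbolicAdaptiveFrequencyGNN(nn.Module):
    """Stacks adaptive-frequency filters on tangent-space features and returns
    Lorentz-model embeddings; a sketch, not the paper's exact architecture."""

    def __init__(self, dim, num_layers=2, c=1.0):
        super().__init__()
        self.c = c
        self.layers = nn.ModuleList(AdaptiveFrequencyFilter(dim) for _ in range(num_layers))

    def forward(self, x_hyp, laplacian):
        x = lorentz_logmap0(x_hyp, self.c)   # hyperbolic -> tangent space at the origin
        for layer in self.layers:
            x = layer(x, laplacian)          # spectral-style filtering per channel
        return lorentz_expmap0(x, self.c)    # tangent space -> hyperbolic
```

In this sketch the only per-layer parameters are the dim-dimensional frequency responses, which is in the spirit of the abstract's small parameter budget; the orthogonally constrained optimizer mentioned in the abstract is not modeled here.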

Data availability

The datasets generated and analyzed during the current study are available from the corresponding author on reasonable request.

Acknowledgement

This work was supported by the National Natural Science Foundation of China (Grant No. 62076193).

Author information

Corresponding author

Correspondence to KuiZhi Mei.

Ethics declarations

Conflict of interest

The authors have no relevant financial or non-financial interests to disclose.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Wei, F., Ping, M. & Mei, K. Adaptive frequency-based fully hyperbolic graph neural networks. Pattern Anal Applic 26, 1741–1751 (2023). https://doi.org/10.1007/s10044-023-01201-8

  • DOI: https://doi.org/10.1007/s10044-023-01201-8
