Abstract
We present an effective graph neural network (GNN)-based knowledge graph embedding model, named WGE, to capture entity- and relation-focused graph structures. Given a knowledge graph, WGE builds a single undirected entity-focused graph that views entities as nodes. WGE also constructs a second single undirected graph from relation-focused constraints, which views both entities and relations as nodes. WGE then employs a GNN-based architecture to learn better vector representations of entities and relations from these two entity- and relation-focused graphs, and feeds the learned representations into a weighted score function that returns triple scores for knowledge graph completion. Experimental results show that WGE outperforms strong baselines on seven benchmark datasets for knowledge graph completion.
Notes
- 3. Note that the experimental setup is the same for both QuatE and WGE for a fair comparison, as WGE uses QuatE for decoding. Zhang et al. [39] reported an MRR of 0.348 and Hits@10 of 55.0% on FB15K-237 for QuatE; however, we could not reproduce those scores.
- 4. Our training protocol monitors the MRR score on the validation set to select the best model checkpoint.
References
Balažević, I., Allen, C., Hospedales, T.M.: TuckER: tensor factorization for knowledge graph completion. In: EMNLP, pp. 5185–5194 (2019)
Berant, J., Chou, A., Frostig, R., Liang, P.: Semantic parsing on freebase from question-answer pairs. In: EMNLP, pp. 1533–1544 (2013)
Bordes, A., Usunier, N., García-Durán, A., Weston, J., Yakhnenko, O.: Translating embeddings for modeling multi-relational data. In: NIPS, pp. 2787–2795 (2013)
Bordes, A., Weston, J., Collobert, R., Bengio, Y.: Learning structured embeddings of knowledge bases. In: AAAI, pp. 301–306 (2011)
Chen, X., Zhou, Z., Gao, M., Shi, D., Husen, M.N.: Knowledge representation combining quaternion path integration and depth-wise atrous circular convolution. In: UAI, pp. 336–345 (2022)
Demir, C., Ngomo, A.-C.N.: Convolutional complex knowledge graph embeddings. In: Verborgh, R., et al. (eds.) ESWC 2021. LNCS, vol. 12731, pp. 409–424. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-77385-4_24
Dettmers, T., Minervini, P., Stenetorp, P., Riedel, S.: Convolutional 2D knowledge graph embeddings. In: AAAI, pp. 1811–1818 (2018)
Dutta, S., Weikum, G.: Cross-document co-reference resolution using sample-based clustering with knowledge enrichment. Trans. ACL 3, 15–28 (2015)
Fader, A., Zettlemoyer, L., Etzioni, O.: Open question answering over curated and extracted knowledge bases. In: KDD, pp. 1156–1165 (2014)
Ferrucci, D.A.: Introduction to "This is Watson". IBM J. Res. Dev. 56(3), 235–249 (2012)
Gesese, G.A., Alam, M., Sack, H.: LiterallyWikidata - a benchmark for knowledge graph completion using literals. In: Hotho, A., et al. (eds.) ISWC 2021. LNCS, vol. 12922, pp. 511–527. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-88361-4_30
Glorot, X., Bengio, Y.: Understanding the difficulty of training deep feedforward neural networks. In: AISTATS, pp. 249–256 (2010)
Hamilton, W.L., Ying, R., Leskovec, J.: Inductive representation learning on large graphs. In: NeurIPS (2017)
Hamilton, W.L., Ying, R., Leskovec, J.: Representation learning on graphs: methods and applications. IEEE Data Eng. Bull. 40(3), 52–74 (2018)
Hamilton, W.R.: On quaternions; or on a new system of imaginaries in algebra. London Edinb. Dublin Philos. Mag. J. Sci. 25(163), 10–13 (1844)
Kingma, D., Ba, J.: Adam: a method for stochastic optimization. In: ICLR (2015)
Kipf, T.N., Welling, M.: Semi-supervised classification with graph convolutional networks. In: ICLR (2017)
Krishnamurthy, J., Mitchell, T.: Weakly supervised training of semantic parsers. In: EMNLP-CoNLL, pp. 754–765 (2012)
Levi, F.W.: Finite Geometrical Systems: Six Public Lectures Delivered in February, 1940, at the University of Calcutta. University of Calcutta (1942)
Nguyen, D.Q., Nguyen, D.Q., Nguyen, T.D., Phung, D.: Convolutional neural network-based model for knowledge base completion and its application to search personalization. Semant. Web 10(5), 947–960 (2019)
Nguyen, D.Q., Nguyen, T.D., Phung, D.: Quaternion graph neural networks. In: ACML (2021)
Nguyen, D.Q., Tong, V., Phung, D., Nguyen, D.Q.: Node co-occurrence based graph neural networks for knowledge graph link prediction. In: WSDM, pp. 1589–1592 (2022)
Nguyen, D.Q.: A survey of embedding models of entities and relationships for knowledge graph completion. In: TextGraphs, pp. 1–14 (2020)
Nickel, M., Rosasco, L., Poggio, T.: Holographic embeddings of knowledge graphs. In: AAAI, pp. 1955–1961 (2016)
Parcollet, T., Morchid, M., Linarès, G.: A survey of quaternion neural networks. Artif. Intell. Rev. 53, 2957–2982 (2020)
Paszke, A., Gross, S., Massa, F., et al.: PyTorch: an imperative style, high-performance deep learning library. In: NeurIPS, pp. 8024–8035 (2019)
Ponzetto, S.P., Strube, M.: Exploiting semantic role labeling, wordnet and Wikipedia for coreference resolution. In: NAACL, pp. 192–199 (2006)
Safavi, T., Koutra, D.: CoDEx: a comprehensive knowledge graph completion benchmark. In: EMNLP, pp. 8328–8350 (2020)
Scarselli, F., Gori, M., Tsoi, A.C., Hagenbuchner, M., Monfardini, G.: The graph neural network model. IEEE Trans. Neural Netw. 20(1), 61–80 (2009)
Schlichtkrull, M., Kipf, T.N., Bloem, P., van den Berg, R., Titov, I., Welling, M.: Modeling relational data with graph convolutional networks. In: Gangemi, A., et al. (eds.) ESWC 2018. LNCS, vol. 10843, pp. 593–607. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-93417-4_38
Shang, C., Tang, Y., Huang, J., Bi, J., He, X., Zhou, B.: End-to-end structure-aware convolutional networks for knowledge base completion. In: AAAI, pp. 3060–3067 (2019)
Toutanova, K., Chen, D.: Observed versus latent features for knowledge base and text inference. In: CVSC, pp. 57–66 (2015)
Trouillon, T., Welbl, J., Riedel, S., Gaussier, É., Bouchard, G.: Complex embeddings for simple link prediction. In: ICML, pp. 2071–2080 (2016)
Vashishth, S., Sanyal, S., Nitin, V., Talukdar, P.: Composition-based multi-relational graph convolutional networks. In: ICLR (2020)
Veličković, P., Cucurull, G., Casanova, A., Romero, A., Liò, P., Bengio, Y.: Graph attention networks. In: ICLR (2018)
Wu, Z., Pan, S., Chen, F., Long, G., Zhang, C., Yu, P.S.: A comprehensive survey on graph neural networks. IEEE Trans. Neural Netw. Learn. Syst. 32(1), 4–24 (2021)
Yang, B., Yih, W.T., He, X., Gao, J., Deng, L.: Embedding entities and relations for learning and inference in knowledge bases. In: ICLR (2015)
Zhang, D., Yin, J., Zhu, X., Zhang, C.: Network representation learning: a survey. IEEE Trans. Big Data 6, 3–28 (2020)
Zhang, S., Tay, Y., Yao, L., Liu, Q.: Quaternion knowledge graph embeddings. In: NeurIPS, pp. 2731–2741 (2019)
Zhang, Y., Yao, Q., Dai, W., Chen, L.: AutoSF: searching scoring functions for knowledge graph embedding. In: ICDE, pp. 433–444 (2020)
Acknowledgment
Most of this work was done while Vinh Tong was a research resident at VinAI Research, Vietnam.
Appendix
The hyper-complex Quaternion space [15], which consists of one real axis and three separate imaginary axes, has recently received considerable attention. It provides highly expressive computations through the Hamilton product compared to the Euclidean and complex vector spaces. We provide the key notations and operations related to the Quaternion space required for our later development; additional details can be found in [25].
A quaternion \(q \in \mathbb {H}\) is a hyper-complex number consisting of one real and three separate imaginary components [15], defined as:
$$q = q_r + q_i\,\boldsymbol{\textsf{i}} + q_j\,\boldsymbol{\textsf{j}} + q_k\,\boldsymbol{\textsf{k}}$$
where \(q_r, q_i, q_j, q_k \in \mathbb {R}\), and \(\boldsymbol{\textsf{i}}, \boldsymbol{\textsf{j}}, \boldsymbol{\textsf{k}}\) are imaginary units satisfying \(\boldsymbol{\textsf{i}}^2 = \boldsymbol{\textsf{j}}^2 = \boldsymbol{\textsf{k}}^2 = \boldsymbol{\textsf{i}}\boldsymbol{\textsf{j}}\boldsymbol{\textsf{k}} = -1\). The operations for the Quaternion algebra are defined as follows:
Addition. The addition of two quaternions q and p is defined as:
$$q + p = (q_r + p_r) + (q_i + p_i)\,\boldsymbol{\textsf{i}} + (q_j + p_j)\,\boldsymbol{\textsf{j}} + (q_k + p_k)\,\boldsymbol{\textsf{k}}$$
Norm. The norm \(\Vert q\Vert \) of a quaternion q is computed as:
$$\Vert q\Vert = \sqrt{q_r^2 + q_i^2 + q_j^2 + q_k^2}$$
The normalized or unit quaternion \(q^\triangleleft \) is then defined as \(q^\triangleleft = \frac{q}{\Vert q\Vert }\).
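The norm and normalization above can be sketched in plain Python; the function names `quat_norm` and `quat_normalize` are illustrative, not part of the paper's implementation:

```python
import math

def quat_norm(q):
    """Euclidean norm of a quaternion q = (q_r, q_i, q_j, q_k)."""
    return math.sqrt(sum(c * c for c in q))

def quat_normalize(q):
    """Unit quaternion q / ||q||."""
    n = quat_norm(q)
    return tuple(c / n for c in q)

q = (1.0, 2.0, 2.0, 4.0)
print(quat_norm(q))                   # 5.0
print(quat_norm(quat_normalize(q)))   # 1.0 (up to floating-point error)
```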
Scalar Multiplication. The multiplication of a scalar \(\lambda \) and q is computed as follows:
$$\lambda q = \lambda q_r + \lambda q_i\,\boldsymbol{\textsf{i}} + \lambda q_j\,\boldsymbol{\textsf{j}} + \lambda q_k\,\boldsymbol{\textsf{k}}$$
Conjugate. The conjugate \(q^*\) of a quaternion q is defined as:
$$q^* = q_r - q_i\,\boldsymbol{\textsf{i}} - q_j\,\boldsymbol{\textsf{j}} - q_k\,\boldsymbol{\textsf{k}}$$
Hamilton Product. The Hamilton product \(\otimes \) (i.e., the quaternion multiplication) of two quaternions q and p is defined as:
$$q \otimes p = (q_r p_r - q_i p_i - q_j p_j - q_k p_k) + (q_r p_i + q_i p_r + q_j p_k - q_k p_j)\,\boldsymbol{\textsf{i}} + (q_r p_j - q_i p_k + q_j p_r + q_k p_i)\,\boldsymbol{\textsf{j}} + (q_r p_k + q_i p_j - q_j p_i + q_k p_r)\,\boldsymbol{\textsf{k}}$$
We can express the Hamilton product of q and p in the following matrix form:
$$q \otimes p = \begin{bmatrix} q_r & -q_i & -q_j & -q_k \\ q_i & q_r & -q_k & q_j \\ q_j & q_k & q_r & -q_i \\ q_k & -q_j & q_i & q_r \end{bmatrix} \begin{bmatrix} p_r \\ p_i \\ p_j \\ p_k \end{bmatrix}$$
The Hamilton product of two quaternion vectors \(\boldsymbol{q}\) and \(\boldsymbol{p} \in \mathbb {H}^n\) is computed as:
$$\boldsymbol{q} \otimes \boldsymbol{p} = (\boldsymbol{q}_r \circ \boldsymbol{p}_r - \boldsymbol{q}_i \circ \boldsymbol{p}_i - \boldsymbol{q}_j \circ \boldsymbol{p}_j - \boldsymbol{q}_k \circ \boldsymbol{p}_k) + (\boldsymbol{q}_r \circ \boldsymbol{p}_i + \boldsymbol{q}_i \circ \boldsymbol{p}_r + \boldsymbol{q}_j \circ \boldsymbol{p}_k - \boldsymbol{q}_k \circ \boldsymbol{p}_j)\,\boldsymbol{\textsf{i}} + (\boldsymbol{q}_r \circ \boldsymbol{p}_j - \boldsymbol{q}_i \circ \boldsymbol{p}_k + \boldsymbol{q}_j \circ \boldsymbol{p}_r + \boldsymbol{q}_k \circ \boldsymbol{p}_i)\,\boldsymbol{\textsf{j}} + (\boldsymbol{q}_r \circ \boldsymbol{p}_k + \boldsymbol{q}_i \circ \boldsymbol{p}_j - \boldsymbol{q}_j \circ \boldsymbol{p}_i + \boldsymbol{q}_k \circ \boldsymbol{p}_r)\,\boldsymbol{\textsf{k}}$$
where \(\circ \) denotes the element-wise product. We note that the Hamilton product is not commutative, i.e., \(q \otimes p \ne p \otimes q\).
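The Hamilton product and its non-commutativity can be verified with a minimal Python sketch; `hamilton` is an illustrative helper, not code from the paper:

```python
def hamilton(q, p):
    """Hamilton product of quaternions q = (q_r, q_i, q_j, q_k) and p."""
    qr, qi, qj, qk = q
    pr, pi, pj, pk = p
    return (qr*pr - qi*pi - qj*pj - qk*pk,
            qr*pi + qi*pr + qj*pk - qk*pj,
            qr*pj - qi*pk + qj*pr + qk*pi,
            qr*pk + qi*pj - qj*pi + qk*pr)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
print(hamilton(i, j))  # (0, 0, 0, 1)  -> i ⊗ j =  k
print(hamilton(j, i))  # (0, 0, 0, -1) -> j ⊗ i = -k, so ⊗ is not commutative
```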
We can derive the product of a quaternion matrix \(\boldsymbol{W} \in \mathbb {H}^{m \times n}\) and a quaternion vector \(\boldsymbol{p} \in \mathbb {H}^{n}\) from Eq. 23 as follows:
$$\boldsymbol{W} \otimes \boldsymbol{p} = (\boldsymbol{W}_r\boldsymbol{p}_r - \boldsymbol{W}_i\boldsymbol{p}_i - \boldsymbol{W}_j\boldsymbol{p}_j - \boldsymbol{W}_k\boldsymbol{p}_k) + (\boldsymbol{W}_r\boldsymbol{p}_i + \boldsymbol{W}_i\boldsymbol{p}_r + \boldsymbol{W}_j\boldsymbol{p}_k - \boldsymbol{W}_k\boldsymbol{p}_j)\,\boldsymbol{\textsf{i}} + (\boldsymbol{W}_r\boldsymbol{p}_j - \boldsymbol{W}_i\boldsymbol{p}_k + \boldsymbol{W}_j\boldsymbol{p}_r + \boldsymbol{W}_k\boldsymbol{p}_i)\,\boldsymbol{\textsf{j}} + (\boldsymbol{W}_r\boldsymbol{p}_k + \boldsymbol{W}_i\boldsymbol{p}_j - \boldsymbol{W}_j\boldsymbol{p}_i + \boldsymbol{W}_k\boldsymbol{p}_r)\,\boldsymbol{\textsf{k}}$$
where \(\boldsymbol{p}_r\), \(\boldsymbol{p}_i\), \(\boldsymbol{p}_j\), and \(\boldsymbol{p}_k \in \mathbb {R}^n\) are real vectors; and \(\boldsymbol{W}_r\), \(\boldsymbol{W}_i\), \(\boldsymbol{W}_j\), and \(\boldsymbol{W}_k \in \mathbb {R}^{m \times n}\) are real matrices.
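The matrix-vector Hamilton product reduces to real matrix multiplications over the four components, as a NumPy sketch illustrates (the function name `quat_matvec` and the component-tuple storage layout are assumptions for this example, not the paper's implementation):

```python
import numpy as np

def quat_matvec(W, p):
    """Hamilton product W ⊗ p, where W and p are each stored as a
    4-tuple of real components (r, i, j, k): real matrices for W,
    real vectors for p."""
    Wr, Wi, Wj, Wk = W
    pr, pi, pj, pk = p
    return (Wr @ pr - Wi @ pi - Wj @ pj - Wk @ pk,
            Wr @ pi + Wi @ pr + Wj @ pk - Wk @ pj,
            Wr @ pj - Wi @ pk + Wj @ pr + Wk @ pi,
            Wr @ pk + Wi @ pj - Wj @ pi + Wk @ pr)

n = 3
I, Z = np.eye(n), np.zeros((n, n))
p = tuple(np.random.randn(n) for _ in range(4))
# A purely real identity matrix acts as the multiplicative identity:
out = quat_matvec((I, Z, Z, Z), p)
assert all(np.allclose(o, c) for o, c in zip(out, p))
```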
Quaternion-Inner Product. The quaternion-inner product \(\bullet \) of two quaternion vectors \(\boldsymbol{q}\) and \(\boldsymbol{p} \in \mathbb {H}^n\) returns a scalar as:
$$\boldsymbol{q} \bullet \boldsymbol{p} = \boldsymbol{q}_r^{\top}\boldsymbol{p}_r + \boldsymbol{q}_i^{\top}\boldsymbol{p}_i + \boldsymbol{q}_j^{\top}\boldsymbol{p}_j + \boldsymbol{q}_k^{\top}\boldsymbol{p}_k$$
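The quaternion-inner product is simply the sum of the dot products of the four real component vectors; a small NumPy sketch (with the assumed helper name `quat_inner` and the same 4-tuple storage as above):

```python
import numpy as np

def quat_inner(q, p):
    """Scalar quaternion-inner product q • p: sum of the dot products
    of the four real component vectors (r, i, j, k)."""
    return sum(float(qc @ pc) for qc, pc in zip(q, p))

q = (np.array([1.0, 0.0]), np.array([0.0, 1.0]),
     np.array([1.0, 1.0]), np.array([0.0, 0.0]))
print(quat_inner(q, q))  # 1.0 + 1.0 + 2.0 + 0.0 = 4.0
```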
Quaternion Element-Wise Product. We further define the element-wise product of two quaternion vectors \(\boldsymbol{q}\) and \(\boldsymbol{p} \in \mathbb {H}^n\) as follows:
$$\boldsymbol{q} \circ \boldsymbol{p} = \boldsymbol{q}_r \circ \boldsymbol{p}_r + (\boldsymbol{q}_i \circ \boldsymbol{p}_i)\,\boldsymbol{\textsf{i}} + (\boldsymbol{q}_j \circ \boldsymbol{p}_j)\,\boldsymbol{\textsf{j}} + (\boldsymbol{q}_k \circ \boldsymbol{p}_k)\,\boldsymbol{\textsf{k}}$$
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Tong, V., Nguyen, D.Q., Phung, D., Nguyen, D.Q. (2023). Two-View Graph Neural Networks for Knowledge Graph Completion. In: Pesquita, C., et al. The Semantic Web. ESWC 2023. Lecture Notes in Computer Science, vol 13870. Springer, Cham. https://doi.org/10.1007/978-3-031-33455-9_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-33454-2
Online ISBN: 978-3-031-33455-9