TransG: A Generative Mixture Model for Knowledge Graph Embedding.
Recently, knowledge graph embedding, which projects symbolic entities and relations into a continuous vector space, has become an active topic in artificial intelligence. This paper addresses the issue of multiple relation semantics — that a relation may have multiple meanings revealed by the entity pairs associated with its triples — and proposes a novel Gaussian mixture model for embedding, TransG. The model discovers latent semantics for each relation and leverages a mixture of relation component vectors to embed a fact triple. To the best of our knowledge, this is the first generative model for knowledge graph embedding that can handle multiple relation semantics. Extensive experiments show that the proposed model achieves substantial improvements over state-of-the-art baselines.
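The mixture idea above can be sketched as follows: each relation keeps several component translation vectors, and a triple's plausibility is a weighted sum of Gaussian-style scores over those components. This is a minimal illustrative sketch, not the authors' implementation; the function name `transg_score`, the fixed variance `sigma`, and the example vectors are assumptions for illustration.

```python
import numpy as np

def transg_score(h, t, components, weights, sigma=1.0):
    """Score a triple (h, r, t) where relation r has M semantic components.

    h, t       : (d,) head and tail entity embeddings
    components : (M, d) relation component vectors, one per latent semantic
    weights    : (M,) mixture weights for the components
    Returns a plausibility score; higher means more plausible.
    """
    diffs = h + components - t                 # (M, d) translation error per component
    sq_err = np.sum(diffs ** 2, axis=1)        # squared L2 error for each component
    return float(np.dot(weights, np.exp(-sq_err / sigma ** 2)))

# Hypothetical relation with two latent meanings, e.g. two distinct translations.
h = np.zeros(2)
components = np.array([[1.0, 0.0], [0.0, 1.0]])
weights = np.array([0.5, 0.5])
good_t = np.array([1.0, 0.0])   # matches the first component
bad_t = np.array([5.0, 5.0])    # matches neither component
```

A tail entity aligned with any one component receives a high score, which is how the mixture accommodates a relation's multiple meanings without forcing them into a single translation vector.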
References in zbMATH (referenced in 3 articles)
- Yu, Shih-Yuan; Chhetri, Sujit Rokka; Canedo, Arquimedes; Goyal, Palash; Faruque, Mohammad Abdullah Al: Pykg2vec: a Python library for knowledge graph embedding (2021)
- He, Zhengqiu; Chen, Wenliang; Li, Zhenghua; Zhang, Wei; Shao, Hao; Zhang, Min: Syntax-aware entity representations for neural relation extraction (2019)
- Wang, Lifang; Lu, Xinyu; Jiang, Zejun; Zhang, Zhikai; Li, Ronghan; Zhao, Meng; Chen, Daqing: FRS: a simple knowledge graph embedding model for entity prediction (2019)