KGEL: A novel end-to-end embedding learning framework for knowledge graph completion
Zeb, Adnan (1); Ul Haq, Anwar (1); Zhang, Defu (1); Chen, Junde (1); Gong, Zhiguo (2,3)
2021-04-01
Source Publication: Expert Systems with Applications
ISSN: 0957-4174
Volume: 167
Abstract: Knowledge graphs (KGs) have recently become increasingly popular due to their broad range of essential applications in various downstream tasks, including intelligent search, personalized recommendation, and intelligent financial data analytics. During the automated construction of a KG, knowledge facts from multiple knowledge sources are automatically extracted in the form of triples, and these observed triples are used to derive new, unobserved triples for KG completion (also known as link prediction). State-of-the-art link prediction methods are primarily KG embedding models, among which tensor factorization models have recently drawn much attention for their scalability and expressive feature embeddings, and hence perform well on link prediction. However, these embedding models consider each KG triple individually and fail to capture the useful information present in the neighborhood of a node. To this end, we propose a novel end-to-end KG embedding learning framework that consists of an encoder, a dual weighted graph convolutional network, and a decoder, a novel fully expressive tensor factorization model. The proposed encoder extends the weighted graph convolutional network to generate two rich, high-quality embedding vectors for each node by aggregating information from its neighboring nodes. The proposed decoder has a flexible and powerful tensor representation form of the Tensor Train decomposition that takes advantage of the two representations of each node in its embedding space to accurately model the KG triples. We also derive a bound on the size of the embeddings required for full expressivity and show that our proposed tensor factorization model is fully expressive. Additionally, we show the relationship of our tensor factorization model to previous tensor factorization models. The experimental results show the effectiveness of the proposed framework, which consistently achieves performance gains over several previous models on recent standard link prediction datasets.
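The decoder described in the abstract scores triples via a tensor factorization that combines a relation embedding with the two representations of each entity. A minimal sketch of this idea, assuming a generic trilinear core-tensor scoring form (all names, shapes, and the random initialization are illustrative placeholders; the paper's exact TT-based parameterization differs):

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, d = 5, 3, 4

# The encoder is said to produce TWO embedding vectors per node;
# here they are random stand-ins for the learned representations.
E1 = rng.normal(size=(n_entities, d))   # entity embeddings used as subjects
E2 = rng.normal(size=(n_entities, d))   # entity embeddings used as objects
R = rng.normal(size=(n_relations, d))   # relation embeddings
W = rng.normal(size=(d, d, d))          # hypothetical shared core tensor

def score(s: int, r: int, o: int) -> float:
    """Trilinear product of subject, relation, and object embeddings
    through the core tensor, as in Tucker/TT-style factorization."""
    return float(np.einsum('i,ijk,j,k->', E1[s], W, R[r], E2[o]))

# A triple (s, r, o) is ranked by its score against corrupted triples
# during link prediction.
print(score(0, 0, 1))
```

Using separate subject-side and object-side embeddings (E1 vs. E2) is what lets such a model distinguish a node's role in a triple, which is one motivation for the paper's dual-representation encoder.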
Keywords: Knowledge graph; Link prediction; Tensor factorization; Tensor train decomposition; Weighted graph convolutional network
DOI: 10.1016/j.eswa.2020.114164
Language: English
Citation Statistics
Cited Times (WOS): 1
Document Type: Journal article
Collection: University of Macau
Corresponding Author: Zhang, Defu
Affiliations:
1. School of Informatics, Xiamen University, Fujian, 361005, China
2. The State Key Laboratory of Internet of Things for Smart City, University of Macau, Macau, China
3. Department of Computer and Information Science, University of Macau, Macau, China
Recommended Citation
GB/T 7714: Zeb, Adnan, Ul Haq, Anwar, Zhang, Defu, et al. KGEL: A novel end-to-end embedding learning framework for knowledge graph completion[J]. Expert Systems with Applications, 2021, 167.
APA: Zeb, Adnan, Ul Haq, Anwar, Zhang, Defu, Chen, Junde, & Gong, Zhiguo. (2021). KGEL: A novel end-to-end embedding learning framework for knowledge graph completion. Expert Systems with Applications, 167.
MLA: Zeb, Adnan, et al. "KGEL: A novel end-to-end embedding learning framework for knowledge graph completion". Expert Systems with Applications 167 (2021).
Google Scholar
Similar articles in Google Scholar
[Zeb,Adnan]'s Articles
[Ul Haq,Anwar]'s Articles
[Zhang,Defu]'s Articles
Baidu academic
Similar articles in Baidu academic
[Zeb,Adnan]'s Articles
[Ul Haq,Anwar]'s Articles
[Zhang,Defu]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Zeb,Adnan]'s Articles
[Ul Haq,Anwar]'s Articles
[Zhang,Defu]'s Articles
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.