Leveraging Literals for Knowledge Graph Embeddings


Volume

3005

Journal

CEUR workshop proceedings

Book Title

Proceedings of the Doctoral Consortium at ISWC 2021 - ISWC-DC 2021

Publisher

Aachen, Germany : RWTH Aachen

Abstract

Knowledge Graphs (KGs) have become invaluable for various applications such as named entity recognition, entity linking, and question answering. However, these KG-based applications incur substantial computational and storage costs. This motivates transforming high-dimensional KGs into low-dimensional vector spaces, i.e., learning representations for KGs. Since a KG represents facts both as interrelations between entities and as attributes of entities, the semantics present in both forms should be preserved when transforming the KG into a vector space. Hence, the main focus of this thesis is handling the multimodality and multilinguality of literals when utilizing them for the representation learning of KGs. A further task is to extract benchmark datasets with a high level of difficulty for tasks such as link prediction and triple classification. These datasets can be used to evaluate both kinds of KG embeddings: those that use literals and those that do not.
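The representation learning described above can be illustrated with a minimal translational (TransE-style) scoring sketch, where entities and relations are mapped to low-dimensional vectors and a triple's plausibility is the negated distance between the translated head and the tail. This is not the thesis's own model; the toy vocabulary, dimensionality, and random initialization below are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # illustrative embedding dimensionality

# Hypothetical toy vocabulary of entities and relations.
entities = {"Berlin": 0, "Germany": 1}
relations = {"capitalOf": 0}

E = rng.normal(size=(len(entities), dim))   # entity embeddings
R = rng.normal(size=(len(relations), dim))  # relation embeddings

def score(h: str, r: str, t: str) -> float:
    """TransE-style plausibility of (h, r, t): -||h + r - t||.
    Higher (closer to zero) means more plausible; training would
    adjust E and R so true triples score higher than corrupted ones."""
    return float(-np.linalg.norm(E[entities[h]] + R[relations[r]] - E[entities[t]]))

s = score("Berlin", "capitalOf", "Germany")
print(s)
```

Literal-aware variants extend this scheme by additionally encoding attribute values (numbers, text, images) of entities into the same vector space, which is the setting the abstract targets.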

License

CC BY 4.0 International