Abstract

Learning representations for knowledge base entities and concepts is becoming increasingly important for NLP applications. However, recent entity embedding methods have relied on structured resources that are expensive to create for new domains and corpora. We present a distantly-supervised method for jointly learning embeddings of entities and text from an unannotated corpus, using only a list of mappings between entities and surface forms. We learn embeddings from open-domain and biomedical corpora, and compare against prior methods that rely on human-annotated text or large knowledge graph structure. Our embeddings capture entity similarity and relatedness better than prior work, both on existing biomedical datasets and on a new Wikipedia-based dataset that we release to the community. Results on analogy completion and entity sense disambiguation indicate that entities and words capture complementary information that can be effectively combined for downstream use.
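To make the distant-supervision setup in the abstract concrete, the sketch below shows the general idea: a surface-form dictionary tags entity mentions in an otherwise unannotated corpus, and a skip-gram model is then trained over the mixed stream of word and entity tokens so both land in a shared vector space. This is a minimal illustration, not the paper's exact training objective; the dictionary contents and the helper names (surface_to_entity, tag_sentence) are hypothetical, and the real method handles ambiguous surface forms rather than assuming a one-to-one mapping.

```python
from gensim.models import Word2Vec

# Hypothetical surface-form dictionary mapping lowercase surface strings to
# entity identifiers (e.g., Wikipedia titles or UMLS CUIs). Per the abstract,
# this list is the only supervision; the corpus carries no annotations.
surface_to_entity = {
    "heart attack": "ENTITY/Myocardial_infarction",
    "aspirin": "ENTITY/Aspirin",
}

def tag_sentence(tokens, surface_to_entity, max_len=3):
    """Greedily replace dictionary-matched spans (longest match first)
    with entity tokens, yielding a mixed word/entity token stream."""
    out, i = [], 0
    while i < len(tokens):
        for n in range(max_len, 0, -1):
            span = " ".join(tokens[i:i + n]).lower()
            if span in surface_to_entity:
                out.append(surface_to_entity[span])
                i += n
                break
        else:
            out.append(tokens[i])
            i += 1
    return out

corpus = [
    "patients who had a heart attack were given aspirin".split(),
]
tagged = [tag_sentence(s, surface_to_entity) for s in corpus]

# Skip-gram over the mixed stream places words and entities in one space;
# min_count=1 only because this toy corpus is tiny.
model = Word2Vec(tagged, vector_size=100, sg=1, window=5, min_count=1)
print(model.wv.most_similar("ENTITY/Aspirin"))
```

Because entity tokens share contexts with ordinary words during training, nearest-neighbor queries can mix the two vocabularies, which is what enables the entity similarity and disambiguation evaluations described above.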

Original language: English
Title of host publication: ACL 2018 - Representation Learning for NLP, Proceedings of the 3rd Workshop
Publisher: Association for Computational Linguistics (ACL)
Pages: 195-206
Number of pages: 12
ISBN (Electronic): 9781948087438
State: Published - 2018
Event: 3rd Workshop on Representation Learning for NLP, RepL4NLP 2018 at the 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018 - Melbourne, Australia
Duration: Jul 20 2018 → …

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
ISSN (Print): 0736-587X

Conference

Conference: 3rd Workshop on Representation Learning for NLP, RepL4NLP 2018 at the 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018
Country/Territory: Australia
City: Melbourne
Period: 07/20/18 → …
