Enter a word, and get back our estimate of the definition encoded in that word's embedding, trained on the Google News corpus.


Overview

Distributed representations of words (embeddings) have been shown to capture lexical semantics, based on their effectiveness in word-similarity tasks. In this project, we study whether embeddings can be used to generate dictionary definitions of words, as a more direct and transparent representation of the embeddings' semantics.
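As a minimal sketch of what "word similarity" means here, the example below scores pairs of words by the cosine similarity of their vectors. The embedding values are made-up toy numbers for illustration, not the actual Google News vectors:

```python
import numpy as np

# Toy 4-dimensional embeddings (illustrative values only, not real
# vectors from the Google News word2vec model).
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.0]),
    "queen": np.array([0.7, 0.7, 0.2, 0.0]),
    "apple": np.array([0.0, 0.1, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: the standard
    word-similarity measure for embeddings."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
sim_kq = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_ka = cosine_similarity(embeddings["king"], embeddings["apple"])
print(f"king~queen: {sim_kq:.3f}, king~apple: {sim_ka:.3f}")
```

Definition modeling goes a step further: instead of reporting only a similarity score, it decodes the vector into a natural-language definition.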

Article

Definition Modeling: Learning to Define Word Embeddings in Natural Language, published in the Thirty-First AAAI Conference on Artificial Intelligence (AAAI 2017). BibTeX:

@inproceedings{AAAI1714827,
  author    = {Thanapon Noraset and Chen Liang and Larry Birnbaum and Doug Downey},
  title     = {Definition Modeling: Learning to Define Word Embeddings
               in Natural Language},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2017},
  url       = {https://www.aaai.org/ocs/index.php/AAAI/AAAI17/paper/view/14827}
}