Enter a word, and get back our estimate of the definition encoded in the word's embedding, trained on the Google News corpus.


Overview

Distributed representations of words (embeddings) have been shown to capture lexical semantics, based on their effectiveness on word similarity tasks. In this project, we study whether these embeddings can be used to generate dictionary definitions of words, as a more direct and transparent representation of the embeddings' semantics.
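The core idea can be sketched as follows: condition a language model on a word's embedding and decode a definition token by token. This is a minimal toy illustration with random weights standing in for a trained model, not the paper's actual architecture; the vocabulary, dimensions, and `define` function are all hypothetical.

```python
import numpy as np

# Toy sketch of definition modeling (assumed setup): the word's embedding
# seeds the hidden state of a simple RNN language model, which then
# greedily emits a definition token by token.

rng = np.random.default_rng(0)

VOCAB = ["<eos>", "a", "device", "for", "measuring", "time"]
DIM = 8  # hidden/embedding size (hypothetical)

# Random parameters stand in for trained weights.
W_hh = rng.normal(scale=0.1, size=(DIM, DIM))         # hidden -> hidden
W_xh = rng.normal(scale=0.1, size=(DIM, DIM))         # input token -> hidden
W_hy = rng.normal(scale=0.1, size=(len(VOCAB), DIM))  # hidden -> vocab logits
tok_emb = rng.normal(scale=0.1, size=(len(VOCAB), DIM))

def define(word_vec, max_len=10):
    """Greedily decode a definition conditioned on the word's embedding."""
    h = np.tanh(word_vec)   # seed the hidden state with the embedding
    x = tok_emb[0]          # use <eos> as the begin-of-sequence token
    out = []
    for _ in range(max_len):
        h = np.tanh(W_hh @ h + W_xh @ x)
        tok = int(np.argmax(W_hy @ h))
        if tok == 0:        # <eos> terminates the definition
            break
        out.append(VOCAB[tok])
        x = tok_emb[tok]
    return out
```

A trained system would replace the random weights with learned parameters and feed in a pretrained vector (e.g. a Google News word2vec embedding) for the word being defined.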

Article

Definition Modeling: Learning to Define Word Embeddings in Natural Language, published at the Thirty-First AAAI Conference on Artificial Intelligence (AAAI 2017). BibTeX:

@inproceedings{AAAI1714827,
    title = {Definition Modeling: Learning to Define Word Embeddings 
             in Natural Language},
    eventtitle = {AAAI Conference on Artificial Intelligence},
    booktitle = {Proceedings of the Thirty-First {AAAI} 
                 Conference on Artificial Intelligence},
    author = {Noraset, Thanapon and Liang, Chen and 
              Birnbaum, Larry and Downey, Doug},
    year = {2017}
}

Supplementary