Incorporating Both Distributional and Relational Semantics in Word Representations

Daniel Fried, Kevin Duh

We investigate the hypothesis that word representations ought to incorporate both distributional and relational semantics. To this end, we employ the Alternating Direction Method of Multipliers (ADMM), which flexibly optimizes a distributional objective on raw text and a relational objective on WordNet. Preliminary results on knowledge base completion, analogy tests, and parsing show that word representations trained on both objectives can give improvements in some cases.
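Below is a minimal sketch of the kind of ADMM coupling the abstract describes: two embedding matrices over the same vocabulary are trained on separate losses and softly tied together through an augmented-Lagrangian penalty and a dual variable. The losses, gradients, and names used here (corpus_targets, kb_targets, grad_distributional, grad_relational, rho) are illustrative placeholders, not the paper's actual objectives or implementation.

```python
import numpy as np

# Toy setup: one embedding matrix per objective, coupled via ADMM.
# Squared-error losses toward random "targets" stand in for the real
# distributional (raw text) and relational (WordNet) objectives.
rng = np.random.default_rng(0)
vocab_size, dim = 1000, 50

corpus_targets = rng.normal(size=(vocab_size, dim))  # placeholder for corpus signal
kb_targets = rng.normal(size=(vocab_size, dim))      # placeholder for KB signal

W = rng.normal(scale=0.1, size=(vocab_size, dim))  # distributional embeddings
V = rng.normal(scale=0.1, size=(vocab_size, dim))  # relational embeddings
Y = np.zeros((vocab_size, dim))                    # scaled dual variable
rho, lr = 0.05, 0.01                               # penalty weight, step size

def grad_distributional(W):
    """Gradient of the placeholder corpus loss 0.5 * ||W - corpus_targets||^2."""
    return W - corpus_targets

def grad_relational(V):
    """Gradient of the placeholder knowledge-base loss 0.5 * ||V - kb_targets||^2."""
    return V - kb_targets

for step in range(200):
    # W-step: descend the distributional loss plus (rho/2) * ||W - V + Y||^2.
    W -= lr * (grad_distributional(W) + rho * (W - V + Y))
    # V-step: descend the relational loss plus the same coupling penalty.
    V -= lr * (grad_relational(V) + rho * (V - W - Y))
    # Dual update nudges the two embedding sets toward agreement.
    Y += W - V

print("mean gap between embedding sets:", np.abs(W - V).mean())
```

The design point illustrated is that each objective keeps its own set of parameters, so neither loss has to be rewritten to accommodate the other; the penalty term and dual variable alone pull the two representations into agreement.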
