Quantificational features in distributional word representations

Tal Linzen, Emmanuel Dupoux, Benjamin Spector

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Do distributional word representations encode the linguistic regularities that theories of meaning argue they should encode? We address this question in the case of the logical properties (monotonicity, force) of quantificational words such as everything (in the object domain) and always (in the time domain). Using the vector offset approach to solving word analogies, we find that the skip-gram model of distributional semantics behaves in a way that is remarkably consistent with encoding these features in some domains, with accuracy approaching 100%, especially with medium-sized context windows. Accuracy in other domains was less impressive. We compare the performance of the model to the behavior of human participants, and find that humans performed well even where the model struggled.
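    The vector offset approach mentioned in the abstract solves an analogy a:b :: c:? by adding the offset vec(b) - vec(a) to vec(c) and returning the nearest vocabulary word by cosine similarity. The sketch below illustrates this method only in outline, assuming pre-trained skip-gram vectors; the toy vocabulary, random vectors, and function names are illustrative and not taken from the paper's own evaluation code.

```python
# Minimal sketch of the vector offset (3CosAdd) analogy method,
# assuming word vectors are available as a dict of numpy arrays.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def solve_analogy(vectors, a, b, c):
    """Return the word d maximizing cos(vec(b) - vec(a) + vec(c), vec(d)),
    excluding the three query words a, b, c."""
    target = normalize(vectors[b] - vectors[a] + vectors[c])
    best_word, best_sim = None, -np.inf
    for word, vec in vectors.items():
        if word in (a, b, c):
            continue
        sim = float(np.dot(normalize(vec), target))
        if sim > best_sim:
            best_word, best_sim = word, sim
    return best_word

# Illustrative usage with toy random vectors; a real experiment would use
# skip-gram embeddings trained on a large corpus.
rng = np.random.default_rng(0)
vocab = ["everything", "something", "always", "sometimes"]
vectors = {w: rng.normal(size=50) for w in vocab}
print(solve_analogy(vectors, "everything", "always", "something"))
```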

    Original language: English (US)
    Title of host publication: *SEM 2016 - 5th Joint Conference on Lexical and Computational Semantics, Proceedings
    Publisher: Association for Computational Linguistics (ACL)
    Pages: 1-11
    Number of pages: 11
    ISBN (Electronic): 9781941643921
    DOIs
    State: Published - 2016
    Event: 5th Joint Conference on Lexical and Computational Semantics, *SEM 2016 - Berlin, Germany
    Duration: Aug 11 2016 - Aug 12 2016

    Publication series

    Name: *SEM 2016 - 5th Joint Conference on Lexical and Computational Semantics, Proceedings

    Conference

    Conference: 5th Joint Conference on Lexical and Computational Semantics, *SEM 2016
    Country/Territory: Germany
    City: Berlin
    Period: 8/11/16 - 8/12/16

    ASJC Scopus subject areas

    • Information Systems
    • Computer Networks and Communications
    • Computer Science Applications
