The lifted matrix-space model for semantic composition

Woo Jin Chung, Sheng Fu Wang, Samuel R. Bowman

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Tree-structured neural network architectures for sentence encoding draw inspiration from the approach to semantic composition generally seen in formal linguistics, and have shown empirical improvements over comparable sequence models by doing so. Moreover, adding multiplicative interaction terms to the composition functions in these models can yield significant further improvements. However, existing compositional approaches that adopt such a powerful composition function scale poorly, with parameter counts exploding as model dimension or vocabulary size grows. We introduce the Lifted Matrix-Space model, which uses a global transformation to map vector word embeddings to matrices, which can then be composed via an operation based on matrix-matrix multiplication. Its composition function effectively transmits a larger number of activations across layers with relatively few model parameters. We evaluate our model on the Stanford NLI corpus, the Multi-Genre NLI corpus, and the Stanford Sentiment Treebank and find that it consistently outperforms TreeLSTM (Tai et al., 2015), the previous best known composition function for tree-structured models.
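    The core idea described in the abstract can be illustrated with a minimal sketch: a learned "lifting" transform maps each d-dimensional word embedding to a k × k matrix, and constituents are then composed bottom-up with matrix-matrix multiplication. The transform, sizes, and the tanh nonlinearity below are illustrative assumptions, not the exact parameterization from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    d, k = 50, 10  # embedding dim and matrix side (illustrative sizes)

    # Hypothetical stand-in for the learned global lifting transform:
    # maps a d-dim word vector to a k x k matrix.
    W_lift = rng.standard_normal((k * k, d)) * 0.1

    def lift(v):
        """Lift a word embedding into matrix space."""
        return (W_lift @ v).reshape(k, k)

    def compose(left, right):
        """Compose two constituents via matrix-matrix multiplication,
        followed by a tanh squashing nonlinearity (an assumption made
        for this sketch, not the paper's exact composition function)."""
        return np.tanh(left @ right)

    # Compose two words bottom-up, as at one node of a binary parse tree.
    very, good = rng.standard_normal(d), rng.standard_normal(d)
    phrase = compose(lift(very), lift(good))
    print(phrase.shape)  # (10, 10): the phrase is itself a k x k matrix
    ```

    Because the output of `compose` is again a k × k matrix, the same operation can be applied recursively up the tree, which is what lets the model transmit many activations per layer while the parameter count stays tied to the single lifting transform rather than to the vocabulary.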

    Original language: English (US)
    Title of host publication: CoNLL 2018 - 22nd Conference on Computational Natural Language Learning, Proceedings
    Publisher: Association for Computational Linguistics (ACL)
    Pages: 508-518
    Number of pages: 11
    ISBN (Electronic): 9781948087728
    State: Published - Jan 1 2018
    Event: 22nd Conference on Computational Natural Language Learning, CoNLL 2018 - Brussels, Belgium
    Duration: Oct 31 2018 – Nov 1 2018

    Publication series

    Name: CoNLL 2018 - 22nd Conference on Computational Natural Language Learning, Proceedings

    Conference

    Conference: 22nd Conference on Computational Natural Language Learning, CoNLL 2018
    Country: Belgium
    City: Brussels
    Period: 10/31/18 – 11/1/18

    ASJC Scopus subject areas

    • Linguistics and Language
    • Artificial Intelligence
    • Human-Computer Interaction
