Tree-structured composition in neural networks without tree-structured architectures

Samuel R. Bowman, Christopher D. Manning, Christopher Potts

    Research output: Contribution to journal › Conference article › peer-review

    Abstract

    Tree-structured neural networks encode a particular tree geometry for a sentence directly in the network design. However, these models have at best only slightly outperformed simpler sequence-based models. We hypothesize that neural sequence models like LSTMs are in fact able to discover and implicitly use recursive compositional structure, at least for tasks with clear cues to that structure in the data. We demonstrate this possibility using an artificial data task for which recursive compositional structure is crucial, and find that an LSTM-based sequence model can indeed learn to exploit the underlying tree structure. However, its performance consistently lags behind that of tree models, even on large training sets, suggesting that tree-structured models are more effective at exploiting recursive structure.
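The contrast the abstract draws can be illustrated with a toy sketch: a tree model composes word vectors by recursing over an explicit parse, while a sequence model folds over the flat token list and must recover any bracketing implicitly. The `combine` function, embeddings, and tree below are hypothetical illustrations, not the paper's actual model or data.

```python
# Toy contrast between tree-structured composition and a flat
# left-to-right fold. The asymmetric "combine" stands in for a learned
# composition function; everything here is an illustrative assumption.

def combine(left, right):
    # Asymmetric blend of two child vectors, so that composition
    # order matters (as it would for a learned composition function).
    return [0.75 * l + 0.25 * r for l, r in zip(left, right)]

def tree_compose(node, embed):
    # A tree-structured model composes children recursively,
    # following the given parse tree.
    if isinstance(node, str):              # leaf: look up word vector
        return embed[node]
    left, right = node
    return combine(tree_compose(left, embed), tree_compose(right, embed))

def seq_compose(tokens, embed):
    # A sequence model (LSTM-like) folds tokens left to right and
    # must discover any tree structure implicitly in its state.
    state = embed[tokens[0]]
    for tok in tokens[1:]:
        state = combine(state, embed[tok])
    return state

embed = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [1.0, 1.0]}
tree = ("a", ("b", "c"))                   # right-branching parse: (a (b c))
flat = ["a", "b", "c"]                     # same words, no bracketing

print(tree_compose(tree, embed))           # → [0.8125, 0.25]
print(seq_compose(flat, embed))            # → [0.8125, 0.4375]
```

For a right-branching parse the two representations diverge, which is exactly the gap a sequence model has to bridge: its left-to-right fold is equivalent to assuming a left-branching tree, so any other structure must be encoded in its hidden state.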

    Original language: English (US)
    Journal: CEUR Workshop Proceedings
    Volume: 1583
    State: Published - 2015
    Event: NIPS Workshop on Cognitive Computation, CoCo 2015 - Montreal, Canada
    Duration: Dec 11, 2015 - Dec 12, 2015

    ASJC Scopus subject areas

    • Computer Science (all)
