Can you tell me how to get past Sesame Street? Sentence-level pretraining beyond language modeling

Alex Wang, Jan Hula, Patrick Xia, Raghavendra Pappagari, R. Thomas McCoy, Roma Patel, Najoung Kim, Ian Tenney, Yinghui Huang, Katherin Yu, Shuning Jin, Berlin Chen, Benjamin van Durme, Edouard Grave, Ellie Pavlick, Samuel R. Bowman

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Natural language understanding has recently seen a surge of progress with the use of sentence encoders like ELMo (Peters et al., 2018a) and BERT (Devlin et al., 2019), which are pretrained on variants of language modeling. We conduct the first large-scale systematic study of candidate pretraining tasks, comparing 19 different tasks both as alternatives and complements to language modeling. Our primary results support the use of language modeling, especially when combined with pretraining on additional labeled-data tasks. However, our results are mixed across pretraining tasks and show some concerning trends: In ELMo's pretrain-then-freeze paradigm, random baselines are worryingly strong and results vary strikingly across target tasks. In addition, fine-tuning BERT on an intermediate task often negatively impacts downstream transfer. In a more positive trend, we see modest gains from multitask training, suggesting the development of more sophisticated multitask and transfer learning techniques as an avenue for further research.
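    The "pretrain-then-freeze" paradigm mentioned in the abstract can be illustrated with a toy sketch: a pretrained encoder is held fixed and only a small task-specific head is trained on the target task. Everything below (the random-projection "encoder", the synthetic data, and all variable names) is illustrative and not from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Pretend "pretrained" encoder: a fixed random projection standing in for a
    # frozen sentence encoder. Its weights W_enc are never updated below.
    D_IN, D_ENC = 20, 8
    W_enc = rng.normal(size=(D_IN, D_ENC))

    def encode(x):
        """Frozen encoder: W_enc is held fixed (no gradient updates)."""
        return np.tanh(x @ W_enc)

    # Synthetic binary "target task".
    X = rng.normal(size=(200, D_IN))
    true_w = rng.normal(size=D_IN)
    y = (X @ true_w > 0).astype(float)

    # Trainable head: logistic regression on top of the frozen representations.
    H = encode(X)
    w, b = np.zeros(D_ENC), 0.0
    lr = 0.5
    for _ in range(500):
        p = 1.0 / (1.0 + np.exp(-(H @ w + b)))   # sigmoid
        grad_w = H.T @ (p - y) / len(y)          # only the head gets gradients
        grad_b = (p - y).mean()
        w -= lr * grad_w
        b -= lr * grad_b

    acc = (((H @ w + b) > 0).astype(float) == y).mean()
    print(f"probe accuracy on frozen features: {acc:.2f}")
    ```

    In the paper's full-scale setting the frozen features come from a real pretrained encoder rather than a random projection, but the study's finding that random baselines are "worryingly strong" is precisely about how well even an untrained encoder like this one can do.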

    Original language: English (US)
    Title of host publication: ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference
    Publisher: Association for Computational Linguistics (ACL)
    Pages: 4465-4476
    Number of pages: 12
    ISBN (Electronic): 9781950737482
    State: Published - 2020
    Event: 57th Annual Meeting of the Association for Computational Linguistics, ACL 2019 - Florence, Italy
    Duration: Jul 28 2019 - Aug 2 2019

    Publication series

    Name: ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference

    Conference

    Conference: 57th Annual Meeting of the Association for Computational Linguistics, ACL 2019
    Country: Italy
    City: Florence
    Period: 7/28/19 - 8/2/19

    ASJC Scopus subject areas

    • Language and Linguistics
    • Computer Science (all)
    • Linguistics and Language


  • Cite this

    Wang, A., Hula, J., Xia, P., Pappagari, R., McCoy, R. T., Patel, R., Kim, N., Tenney, I., Huang, Y., Yu, K., Jin, S., Chen, B., Van Durme, B., Grave, E., Pavlick, E., & Bowman, S. R. (2020). Can you tell me how to get past Sesame Street? Sentence-level pretraining beyond language modeling. In ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (pp. 4465-4476). (ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference). Association for Computational Linguistics (ACL).