The Impact of Depth on Compositional Generalization in Transformer Language Models

Jackson Petty, Sjoerd van Steenkiste, Ishita Dasgupta, Fei Sha, Dan Garrette, Tal Linzen

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    To process novel sentences, language models (LMs) must generalize compositionally—combine familiar elements in new ways. What aspects of a model’s structure promote compositional generalization? Focusing on transformers, we test the hypothesis, motivated by theoretical and empirical work, that deeper transformers generalize more compositionally. Simply adding layers increases the total number of parameters; to address this confound between depth and size, we construct three classes of models which trade off depth for width such that the total number of parameters is kept constant (41M, 134M and 374M parameters). We pretrain all models as LMs and fine-tune them on tasks that test for compositional generalization. We report three main conclusions: (1) after fine-tuning, deeper models generalize more compositionally than shallower models do, but the benefit of additional layers diminishes rapidly; (2) within each family, deeper models show better language modeling performance, but returns are similarly diminishing; (3) the benefits of depth for compositional generalization cannot be attributed solely to better performance on language modeling. Because model latency is approximately linear in the number of layers, these results lead us to the recommendation that, with a given total parameter budget, transformers can be made shallower than is typical without sacrificing performance.
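
    The depth-for-width tradeoff described in the abstract can be illustrated with a back-of-the-envelope calculation. The sketch below is an assumption-laden approximation, not the authors' exact construction: it assumes a standard transformer block with roughly 12 * d_model^2 non-embedding parameters per layer (attention projections plus a feed-forward layer of inner size 4 * d_model) and solves for the width that holds the parameter budget roughly constant as depth varies.

    ```python
    import math

    def width_for_depth(num_layers: int, param_budget: int) -> int:
        """Approximate width (d_model) keeping the non-embedding parameter
        count near `param_budget` for a given number of layers.

        Assumes ~12 * d_model^2 parameters per layer: 4 * d_model^2 for the
        attention Q/K/V/output projections and 8 * d_model^2 for a
        feed-forward block of inner size 4 * d_model. This is an
        illustrative approximation, not the paper's exact recipe.
        """
        d_model = math.sqrt(param_budget / (12 * num_layers))
        # Round to a multiple of 64 so the width divides evenly into heads.
        return max(64, int(round(d_model / 64)) * 64)

    # Example: trade depth for width at a fixed ~134M non-embedding budget.
    for layers in (2, 4, 8, 16, 32):
        print(layers, width_for_depth(layers, 134_000_000))
    ```

    Under these assumptions, halving the width roughly quadruples the number of layers one can afford at the same budget, which is the kind of constant-size family the paper compares.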

    Original language: English (US)
    Title of host publication: Long Papers
    Editors: Kevin Duh, Helena Gomez, Steven Bethard
    Publisher: Association for Computational Linguistics (ACL)
    Pages: 7232-7245
    Number of pages: 14
    ISBN (Electronic): 9798891761148
    DOIs
    State: Published - 2024
    Event: 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024 - Hybrid, Mexico City, Mexico
    Duration: Jun 16, 2024 – Jun 21, 2024

    Publication series

    Name: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024
    Volume: 1

    Conference

    Conference: 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024
    Country/Territory: Mexico
    City: Hybrid, Mexico City
    Period: 6/16/24 – 6/21/24

    ASJC Scopus subject areas

    • Computer Networks and Communications
    • Hardware and Architecture
    • Information Systems
    • Software
