LSTMs Can Learn Basic Wh- and Relative Clause Dependencies in Norwegian

Anastasia Kobzeva, Suhas Arehalli, Tal Linzen, Dave Kush

    Research output: Contribution to conference › Paper › peer-review


    One of the key features of natural languages is that they exhibit long-distance filler-gap dependencies (FGDs): In the sentence 'What do you think the pilot sent __?' the wh-filler what is interpreted as the object of the verb sent across multiple words. The ability to establish FGDs is thought to require hierarchical syntactic structure. However, recent research suggests that recurrent neural networks (RNNs) without a specific hierarchical bias can learn complex generalizations about wh-questions in English from raw text data (Wilcox et al., 2018, 2019). Across two experiments, we probe the generality of this result by testing whether a long short-term memory (LSTM) RNN model can learn basic generalizations about FGDs in Norwegian. Testing Norwegian allows us to assess whether previous results were due to distributional statistics of the English input or whether models can extract similar generalizations in languages with different syntactic distributions. We also test the model's performance on two different types of FGDs: wh-questions and relative clauses, allowing us to determine if the model learns abstract generalizations about FGDs that extend beyond a single construction type. Results from Experiment 1 suggest that the model expects fillers to be paired with gaps and that this expectation generalizes across different syntactic positions. Results from Experiment 2 suggest that the model's expectations are largely unaffected by increased linear distance between the filler and the gap. Our findings support the conclusion that LSTM RNNs' ability to learn basic generalizations about FGDs is robust across dependency type and language.
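The filler-gap probing paradigm referenced here (Wilcox et al., 2018) is typically operationalized as a 2x2 wh-licensing interaction over model surprisals, crossing filler presence with gap presence. A minimal sketch of that difference-in-differences calculation (this is an illustrative reconstruction, not the authors' code, and the surprisal values below are made up):

```python
def licensing_interaction(s_filler_gap, s_nofiller_gap,
                          s_filler_nogap, s_nofiller_nogap):
    """Wh-licensing interaction from four surprisal values (in bits)
    measured at the critical region of a 2x2 filler/gap design.

    A model that has learned filler-gap dependencies should show a
    negative interaction: a filler lowers surprisal where a gap
    occurs and raises it where no gap follows.
    """
    gap_effect = s_filler_gap - s_nofiller_gap        # filler helps at a gap
    nogap_effect = s_filler_nogap - s_nofiller_nogap  # filler hurts without a gap
    return gap_effect - nogap_effect

# Illustrative (made-up) surprisals: filler+gap is easy (5 bits),
# gap without filler is hard (9), filler without gap is hard (12),
# plain sentence is baseline (8).
print(licensing_interaction(5.0, 9.0, 12.0, 8.0))  # -8.0
```

In practice the four surprisal values would come from a trained LSTM language model evaluated on minimally differing Norwegian sentences, one quadruple per item.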

    Original language: English (US)
    Number of pages: 7
    State: Published - 2022
    Event: 44th Annual Meeting of the Cognitive Science Society: Cognitive Diversity, CogSci 2022 - Toronto, Canada
    Duration: Jul 27 2022 - Jul 30 2022


    Conference: 44th Annual Meeting of the Cognitive Science Society: Cognitive Diversity, CogSci 2022


    Keywords

    • Embedded Questions
    • Filler-Gap Dependencies
    • Neural Language Models
    • Norwegian
    • Relative Clauses

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Science Applications
    • Human-Computer Interaction
    • Cognitive Neuroscience


