Inducing nonlocal constraints from baseline phonotactics

Maria Gouskova, Gillian Gallagher

    Research output: Contribution to journal › Article › peer-review


    Nonlocal phonological patterns such as vowel harmony and long-distance consonant assimilation and dissimilation motivate representations that include only the interacting segments—projections. We present an implemented computational learner that induces projections based on phonotactic properties of a language that are observable without nonlocal representations. The learner builds on the base grammar induced by the MaxEnt Phonotactic Learner (Hayes and Wilson 2008). Our model searches this baseline grammar for constraints that suggest nonlocal interactions, capitalizing on the observations that (a) nonlocal interactions can be seen in trigrams if the language has simple syllable structure, and (b) nonlocally interacting segments define a natural class. We show that this model finds nonlocal restrictions on laryngeal consonants in corpora of Quechua and Aymara, and vowel co-occurrence restrictions in Shona.
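The core idea of a projection can be illustrated with a minimal sketch (this is illustrative code, not the authors' implementation, and the segment inventory below is hypothetical rather than drawn from the paper's corpora): segments belonging to a natural class are projected onto their own tier, so that cooccurrence restrictions which are nonlocal in the word become local (adjacent) on the projection.

```python
# Hypothetical sketch of a projection: keep only the segments in a
# designated natural class, then read off adjacent pairs on that tier.
# Nonlocal interactions in the word surface as bigrams on the projection.

def project(word, tier_segments):
    """Return the subsequence of segments belonging to the projected class."""
    return [s for s in word if s in tier_segments]

def tier_bigrams(word, tier_segments):
    """Adjacent pairs on the projection: nonlocal pairs made local."""
    tier = project(word, tier_segments)
    return list(zip(tier, tier[1:]))

# Illustrative laryngeal tier (ejectives and aspirates); the word and
# inventory are made up for demonstration.
laryngeals = {"k'", "kh", "t'", "th", "p'", "ph"}
word = ["k'", "a", "m", "a", "th", "i"]
print(tier_bigrams(word, laryngeals))  # [("k'", 'th')]
```

A learner can then state constraints over these tier bigrams, e.g. penalizing two ejectives in the same word, without reference to the intervening vowels and plain consonants.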

    Original language: English (US)
    Pages (from-to): 77-116
    Number of pages: 40
    Journal: Natural Language and Linguistic Theory
    Issue number: 1
    State: Published - Feb 1 2020


    Keywords

    • Aymara
    • Computational modeling
    • Consonant dissimilation
    • Consonant harmony
    • Corpus phonology
    • Inductive learning
    • Learnability
    • Nonlocal phonology
    • Phonology
    • Phonotactics
    • Quechua
    • Shona
    • Vowel harmony

    ASJC Scopus subject areas

    • Language and Linguistics
    • Linguistics and Language


