TY - GEN
T1 - Using priming to uncover the organization of syntactic representations in neural language models
AU - Prasad, Grusha
AU - Van Schijndel, Marten
AU - Linzen, Tal
N1 - Publisher Copyright:
© 2019 Association for Computational Linguistics.
PY - 2019
Y1 - 2019
AB - Neural language models (LMs) perform well on tasks that require sensitivity to syntactic structure. Drawing on the syntactic priming paradigm from psycholinguistics, we propose a novel technique to analyze the representations that enable such success. By establishing a gradient similarity metric between structures, this technique allows us to reconstruct the organization of the LMs' syntactic representational space. We use this technique to demonstrate that LSTM LMs' representations of different types of sentences with relative clauses are organized hierarchically in a linguistically interpretable manner, suggesting that the LMs track abstract properties of the sentence.
UR - http://www.scopus.com/inward/record.url?scp=85084330807&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85084330807&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85084330807
T3 - CoNLL 2019 - 23rd Conference on Computational Natural Language Learning, Proceedings of the Conference
SP - 66
EP - 76
BT - CoNLL 2019 - 23rd Conference on Computational Natural Language Learning, Proceedings of the Conference
PB - Association for Computational Linguistics
T2 - 23rd Conference on Computational Natural Language Learning, CoNLL 2019
Y2 - 3 November 2019 through 4 November 2019
ER -