TY - GEN
T1 - A Fast Unified Model for Parsing and Sentence Understanding
AU - Bowman, Samuel R.
AU - Gauthier, Jon
AU - Rastogi, Abhinav
AU - Gupta, Raghav
AU - Manning, Christopher D.
AU - Potts, Christopher
PY - 2016
Y1 - 2016
AB - Tree-structured neural networks exploit valuable syntactic parse information as they interpret the meanings of sentences. However, they suffer from two key technical problems that make them slow and unwieldy for large-scale NLP tasks: they usually operate on parsed sentences and they do not directly support batched computation. We address these issues by introducing the Stack-augmented Parser-Interpreter Neural Network (SPINN), which combines parsing and interpretation within a single tree-sequence hybrid model by integrating tree-structured sentence interpretation into the linear sequential structure of a shift-reduce parser. Our model supports batched computation for a speedup of up to 25x over other tree-structured models, and its integrated parser can operate on unparsed data with little loss in accuracy. We evaluate it on the Stanford NLI entailment task and show that it significantly outperforms other sentence-encoding models.
UR - http://www.scopus.com/inward/record.url?scp=85011928251&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85011928251&partnerID=8YFLogxK
U2 - 10.18653/v1/p16-1139
DO - 10.18653/v1/p16-1139
M3 - Conference contribution
AN - SCOPUS:85011928251
T3 - 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
SP - 1466
EP - 1477
BT - 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
PB - Association for Computational Linguistics (ACL)
T2 - 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016
Y2 - 7 August 2016 through 12 August 2016
ER -