TY - GEN
T1 - A neural attention model for sentence summarization
AU - Rush, Alexander M.
AU - Chopra, Sumit
AU - Weston, Jason
N1 - Publisher Copyright:
© 2015 Association for Computational Linguistics.
PY - 2015
Y1 - 2015
N2 - Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
UR - http://www.scopus.com/inward/record.url?scp=84957571911&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84957571911&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84957571911
T3 - Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing
SP - 379
EP - 389
BT - Conference Proceedings - EMNLP 2015
PB - Association for Computational Linguistics (ACL)
T2 - Conference on Empirical Methods in Natural Language Processing, EMNLP 2015
Y2 - 17 September 2015 through 21 September 2015
ER -