A character-level decoder without explicit segmentation for neural machine translation

Junyoung Chung, Kyunghyun Cho, Yoshua Bengio

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The existing machine translation systems, whether phrase-based or neural, have relied almost exclusively on word-level modelling with explicit segmentation. In this paper, we ask a fundamental question: can neural machine translation generate a character sequence without any explicit segmentation? To answer this question, we evaluate an attention-based encoder-decoder with a subword-level encoder and a character-level decoder on four language pairs (En-Cs, En-De, En-Ru and En-Fi) using the parallel corpora from WMT'15. Our experiments show that the models with a character-level decoder outperform the ones with a subword-level decoder on all of the four language pairs. Furthermore, the ensembles of neural models with a character-level decoder outperform the state-of-the-art non-neural machine translation systems on En-Cs, En-De and En-Fi and perform comparably on En-Ru.
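The abstract's core idea is that the decoder emits one character at a time, so the target side needs no tokenizer or word vocabulary. The following is a minimal illustrative sketch (not code from the paper): it builds a character-level target vocabulary and encodes a sentence into the ID sequence such a decoder would be trained to generate. All function and token names (`build_char_vocab`, `<bos>`, `<eos>`) are hypothetical.

```python
# Hypothetical sketch of character-level target representation:
# no explicit segmentation is needed, because whitespace is treated
# as just another character in the output alphabet.

def build_char_vocab(sentences):
    """Map every character seen in the corpus to an integer ID.
    IDs 0 and 1 are reserved for <bos> and <eos> (assumed markers)."""
    chars = sorted({ch for s in sentences for ch in s})
    vocab = {"<bos>": 0, "<eos>": 1}
    vocab.update({ch: i + 2 for i, ch in enumerate(chars)})
    return vocab

def encode_target(sentence, vocab):
    """Turn a target sentence into the character-ID sequence a
    character-level decoder would generate, one character per step."""
    return [vocab["<bos>"]] + [vocab[ch] for ch in sentence] + [vocab["<eos>"]]

corpus = ["ein Haus", "ein Hund"]
vocab = build_char_vocab(corpus)
ids = encode_target("ein Haus", vocab)
```

Because the output alphabet is the set of characters rather than words or subwords, the target vocabulary stays tiny and there are no out-of-vocabulary tokens, which is what makes decoding without segmentation feasible.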

Original language: English (US)
Title of host publication: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 1693-1703
Number of pages: 11
ISBN (Electronic): 9781510827585
DOIs
State: Published - 2016
Event: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Berlin, Germany
Duration: Aug 7, 2016 – Aug 12, 2016

Publication series

Name: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
Volume: 3

Other

Other: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016
Country: Germany
City: Berlin
Period: 8/7/16 – 8/12/16

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language

